Abstract: Current research on multi-intent recognition models in natural language processing models the information flow only from intent to slot, while interactive modeling of the information flow from slot to intent is neglected. In addition, the intent recognition task is prone to confusion and may wrongly capture information about other intents, and the quality of contextual semantic feature extraction is poor and needs improvement. To address these problems, this study optimizes the advanced GL-GIN (global-locally graph interaction network) model, exploring an interactive slot-to-intent modeling method built on a one-way slot-to-intent attention layer. Specifically, the study computes slot-to-intent attention scores, incorporates the attention mechanism, and uses these scores as connection weights. As a result, the model can propagate and aggregate intent-related slot information and make each intent focus on the slot information relevant to it, realizing a bidirectional information flow in the multi-intent recognition model. Meanwhile, the BERT model is introduced as the encoding layer to improve the quality of semantic feature extraction. Experiments show that this interactive modeling method yields a significant improvement: compared with the original GL-GIN model, the overall accuracy of the new model on two public datasets (MixATIS and MixSNIPS) increases by 5.2% and 9%, respectively.
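The slot-to-intent mechanism the abstract describes can be illustrated with a minimal sketch. This is an assumption-laden toy version, not the paper's implementation: scaled dot-product scores are used as the attention weights that connect slot hidden states to intent representations, so each intent gathers a weighted sum of the slot information relevant to it.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def slot_to_intent_attention(intent_h, slot_h):
    """One-way attention from slots to intents (illustrative sketch).

    intent_h: (K, d) array of K intent representations
    slot_h:   (T, d) array of T slot hidden states
    Returns the (K, d) slot-informed intent update and the (K, T)
    attention weights, which play the role of connection weights.
    """
    d = intent_h.shape[-1]
    scores = intent_h @ slot_h.T / np.sqrt(d)   # (K, T) slot-to-intent scores
    weights = softmax(scores, axis=-1)          # each row sums to 1
    update = weights @ slot_h                   # aggregate relevant slot info
    return update, weights
```

For example, with 2 intents and 5 slot states of dimension 4, `update` has shape `(2, 4)` and each row of `weights` is a probability distribution over the 5 slots. The actual model additionally runs this inside a graph interaction layer and pairs it with the existing intent-to-slot direction to obtain the bidirectional flow.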