Computer Systems & Applications, 2022, 31(6): 167-174
Incorporating BERT and Graph Attention Network for Multi-label Text Classification
(Command & Control Engineering College, Army Engineering University of PLA, Nanjing 210007, China)
Abstract
Received: August 13, 2021    Revised: September 13, 2021
Abstract: Multi-label text classification is an important branch of multi-label classification. Existing methods often ignore the relationships between labels and thus fail to exploit label correlations effectively, which degrades classification performance. To address this, this study proposes HBGA (hybrid BERT and graph attention), a model that fuses BERT and a graph attention network. First, BERT produces a contextual vector representation of the input text, and a Bi-LSTM and a capsule network extract the text's global and local features, respectively; these are combined by feature fusion into a text feature vector. Meanwhile, label correlations are modeled with a graph whose nodes are the word embeddings of the labels, and a graph attention network maps these label vectors to a set of interdependent classifiers. Finally, the classifiers are applied to the text features obtained by the feature extraction module, and the whole model is trained end to end; the classifier outputs and the feature information are combined to produce the final predictions. Comparative experiments on the Reuters-21578 and AAPD datasets show that the proposed model achieves effective improvements on multi-label text classification tasks.
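The label-graph component described above can be illustrated with a minimal sketch: a single-head graph attention layer updates each label embedding from its neighbors, and each updated label vector then acts as a classifier weight vector applied to the fused text feature. This is an illustrative sketch under stated assumptions, not the paper's implementation: the names (`gat_layer`, `predict`), the single-head formulation, and the plain dot-product classifier are assumptions, and multi-head attention, learned parameters, and batching are omitted.

```python
import math

def leaky_relu(x, alpha=0.2):
    return x if x > 0 else alpha * x

def matvec(W, h):
    # matrix-vector product with plain lists
    return [sum(w * x for w, x in zip(row, h)) for row in W]

def gat_layer(H, adj, W, a):
    """One single-head graph attention layer over the label graph.

    H   : list of node feature vectors (label word embeddings)
    adj : adjacency lists; adj[i] lists the neighbours of node i
          (each node is assumed to include itself)
    W   : shared linear transform, shape (out_dim, in_dim)
    a   : attention vector, length 2 * out_dim
    """
    Wh = [matvec(W, h) for h in H]
    out = []
    for i, neigh in enumerate(adj):
        # unnormalised attention scores: e_ij = LeakyReLU(a . [Wh_i || Wh_j])
        scores = [leaky_relu(sum(ai * x for ai, x in zip(a, Wh[i] + Wh[j])))
                  for j in neigh]
        # numerically stable softmax over the neighbourhood
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        denom = sum(exps)
        alphas = [e / denom for e in exps]
        # new label representation: attention-weighted sum of neighbours
        out.append([sum(alpha * Wh[j][k] for alpha, j in zip(alphas, neigh))
                    for k in range(len(Wh[i]))])
    return out

def predict(label_reprs, text_feature):
    # each updated label vector serves as a classifier weight vector;
    # per-label probability = sigmoid(dot(label_repr, text_feature))
    return [1.0 / (1.0 + math.exp(-sum(w * x for w, x in zip(c, text_feature))))
            for c in label_reprs]
```

With an identity transform and a zero attention vector, every neighbor receives equal weight, so each label representation becomes the mean of its neighbors' features; nontrivial `W` and `a` would be learned during the end-to-end training the abstract describes.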
Citation:
HAO Chao, QIU Hang-Ping, SUN Yi. Incorporating BERT and Graph Attention Network for Multi-label Text Classification. Computer Systems & Applications, 2022, 31(6): 167-174