Computer Systems & Applications, 2024, 33(7): 239-247
Multi-hop Machine Reading Comprehension Based on Multi-level Information Fusion
(1. School of Information Engineering, Chang'an University, Xi'an 710064, China; 2. Henan North Hongyang Mechanical and Electrical Co., Ltd., Western Henan Industrial Group, Nanyang 474679, China)
Received: January 27, 2024    Revised: February 29, 2024
Abstract: Previous machine reading comprehension models suffer from single-method text feature extraction and incomplete interaction information between the text and the question, so they cannot fully understand the text. This study proposes a machine reading comprehension model based on multi-level information fusion, which obtains text information at multiple levels by applying different methods at different positions in the model. A dilated convolutional network captures the global information of the text; a bi-directional attention mechanism and a self-attention mechanism fuse the interaction information between the text and the question; and a pointer network predicts the answer and its corresponding supporting sentences. The joint F1 scores of the model on the CAIL2019 and CAIL2020 reading comprehension datasets reach 50.09% and 58.44% respectively, a clear improvement over other baseline models.
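As a rough illustration of the pipeline the abstract describes, the sketch below wires together a dilated convolutional encoder, BiDAF-style bidirectional attention, self-attention, and heads for the answer span and supporting sentences in PyTorch. All module names, dimensions, and the exact wiring are illustrative assumptions, not the authors' released implementation; in particular, the simple linear span heads here stand in for the paper's pointer network.

    # Minimal PyTorch sketch of the described architecture (assumed wiring).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class DilatedConvEncoder(nn.Module):
        """Stacked 1D dilated convolutions: widening dilation rates enlarge
        the receptive field so each token sees global context."""
        def __init__(self, dim, dilations=(1, 2, 4)):
            super().__init__()
            self.convs = nn.ModuleList(
                nn.Conv1d(dim, dim, kernel_size=3, dilation=d, padding=d)
                for d in dilations)

        def forward(self, x):                        # x: (B, T, D)
            h = x.transpose(1, 2)                    # (B, D, T) for Conv1d
            for conv in self.convs:
                h = F.relu(conv(h)) + h              # residual connection
            return h.transpose(1, 2)                 # back to (B, T, D)

    class BiAttention(nn.Module):
        """BiDAF-style bidirectional attention between context and question."""
        def __init__(self, dim):
            super().__init__()
            self.score = nn.Linear(3 * dim, 1)

        def forward(self, c, q):                     # c: (B,Tc,D), q: (B,Tq,D)
            Tc, Tq = c.size(1), q.size(1)
            c_e = c.unsqueeze(2).expand(-1, -1, Tq, -1)
            q_e = q.unsqueeze(1).expand(-1, Tc, -1, -1)
            s = self.score(torch.cat([c_e, q_e, c_e * q_e], -1)).squeeze(-1)
            c2q = torch.softmax(s, dim=2) @ q        # context-to-question
            q2c = torch.softmax(s.max(2).values, 1).unsqueeze(1) @ c
            q2c = q2c.expand(-1, Tc, -1)             # question-to-context
            return torch.cat([c, c2q, c * c2q, c * q2c], dim=-1)  # (B,Tc,4D)

    class MultiLevelFusionReader(nn.Module):
        """Hypothetical end-to-end model combining the pieces above."""
        def __init__(self, dim=128):
            super().__init__()
            self.conv_enc = DilatedConvEncoder(dim)
            self.bi_attn = BiAttention(dim)
            self.proj = nn.Linear(4 * dim, dim)
            self.self_attn = nn.MultiheadAttention(dim, num_heads=4,
                                                   batch_first=True)
            self.start_head = nn.Linear(dim, 1)      # answer-start logits
            self.end_head = nn.Linear(dim, 1)        # answer-end logits
            self.support_head = nn.Linear(dim, 1)    # supporting-sentence logits

        def forward(self, ctx, qry, sent_mask):
            # ctx: (B,Tc,D) context embeddings; qry: (B,Tq,D) question embeddings
            # sent_mask: (B,S,Tc) float 0/1 mask pooling tokens into sentences
            c = self.conv_enc(ctx)                   # global context via dilation
            q = self.conv_enc(qry)
            h = self.proj(self.bi_attn(c, q))        # fuse question into context
            h, _ = self.self_attn(h, h, h)           # intra-context interactions
            start = self.start_head(h).squeeze(-1)   # (B, Tc)
            end = self.end_head(h).squeeze(-1)       # (B, Tc)
            sent = sent_mask @ h / sent_mask.sum(-1, keepdim=True).clamp(min=1.0)
            support = self.support_head(sent).squeeze(-1)  # (B, S)
            return start, end, support

Under this reading, the model would presumably be trained with cross-entropy over the start/end logits and binary cross-entropy over the supporting-sentence logits, and the joint F1 would score both predictions together.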
Fund Program: Key Research and Development Program of Shaanxi Province (2019ZDLGY17-08); Special Support Plan of Shaanxi Province for Leading Talents in Scientific and Technological Innovation (TZ0366)
Citation:
ZHU Hai-Fei, DUAN Zong-Tao, WANG Quan-Wei, CAO Jian-Rong, XI Tie-Jun. Multi-hop Machine Reading Comprehension Based on Multi-level Information Fusion. Computer Systems & Applications, 2024, 33(7): 239-247.