Received: July 16, 2020    Revised: August 13, 2020
Abstract: Entity and relationship extraction is one of the key tasks of information extraction; it is a cascaded task comprising entity extraction and relationship extraction. Traditional approaches follow a pipeline paradigm that separates the two subtasks and ignores their intrinsic connection, so the quality of relationship extraction depends heavily on entity extraction and errors accumulate easily. To avoid this problem, we propose an end-to-end joint entity and relationship extraction model, which learns word features with a self-attention mechanism, builds dependency constraints from the dependency information carried by syntactic dependency graphs, and then integrates the constraint information into a graph attention network to extract entities and relationships. Experiments on the public NYT dataset demonstrate the effectiveness of our model: while maintaining high precision, it improves recall significantly and achieves better extraction performance than previous methods.
Keywords: joint entity and relationship extraction; dependency constraint; graph attention network; self-attention mechanism
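The abstract only sketches the architecture at a high level. As a rough, hypothetical illustration (not the authors' implementation), the following minimal PyTorch sketch shows one way self-attention word features can be fed into a graph attention layer whose attention is masked by the syntactic dependency graph; all class names, dimensions, and the tagging head are assumptions made for illustration.

# Minimal, hypothetical sketch of the idea described in the abstract:
# self-attention word features + a graph attention layer constrained to
# syntactic dependency edges. Names and dimensions are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DependencyConstrainedGAT(nn.Module):
    """Single-head graph attention layer masked by a dependency adjacency matrix."""
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h: (seq_len, in_dim) word features; adj: (seq_len, seq_len) 0/1 dependency edges
        z = self.proj(h)
        n = z.size(0)
        pairs = torch.cat([z.unsqueeze(1).expand(n, n, -1),
                           z.unsqueeze(0).expand(n, n, -1)], dim=-1)
        scores = F.leaky_relu(self.attn(pairs).squeeze(-1))          # (n, n)
        # Dependency constraint: attention is only allowed along dependency edges.
        scores = scores.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(scores, dim=-1)
        return torch.relu(alpha @ z)

class JointExtractorSketch(nn.Module):
    """Self-attention word encoder followed by the dependency-constrained GAT."""
    def __init__(self, emb_dim: int = 128, hidden: int = 128, num_tags: int = 9):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(emb_dim, num_heads=4, batch_first=True)
        self.gat = DependencyConstrainedGAT(emb_dim, hidden)
        self.tagger = nn.Linear(hidden, num_tags)   # e.g. joint entity/relation tag scores

    def forward(self, emb: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # emb: (1, seq_len, emb_dim) word embeddings; adj: (seq_len, seq_len)
        ctx, _ = self.self_attn(emb, emb, emb)       # word features from self-attention
        h = self.gat(ctx.squeeze(0), adj)            # fuse dependency constraints
        return self.tagger(h)                        # per-token tag scores

if __name__ == "__main__":
    seq_len, emb_dim = 6, 128
    emb = torch.randn(1, seq_len, emb_dim)
    adj = torch.eye(seq_len)                         # self-loops
    adj[0, 1] = adj[1, 0] = 1.0                      # one example dependency edge
    print(JointExtractorSketch()(emb, adj).shape)    # torch.Size([6, 9])

The masking step is the key point: attention weights are computed only over pairs of words connected in the dependency graph, which is one straightforward way to encode the "dependency constraint" the abstract refers to.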
Citation:
REN Peng-Cheng, YU Qiang, HOU Zhao-Xiang. Graph Network with Dependency Constraints for Joint Entity and Relationship Extraction. Computer Systems & Applications, 2021, 30(3): 24-32