Chinese Entity Relation Extraction Based on Multi-Feature BERT Model
Abstract:

Relation extraction is a core technology for constructing knowledge graphs. The complexity of Chinese grammar and sentence structure, together with the limited feature-extraction capacity and weak semantic representation of existing neural network models, restricts relation extraction for Chinese entities. This study proposes a relation extraction algorithm based on a BERT pretraining model. It preprocesses the corpus by extracting keywords, entity pairs, and entity types, and integrates these features to strengthen the semantic learning ability of the BERT model, greatly reducing the loss of semantic features. Relations are then classified with a Softmax classifier. Experimental results show that the model outperforms existing neural network models; in particular, it reaches an F1-score of 97.50% on the Chinese data set.
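To make the multi-feature idea concrete, the following is a minimal sketch (not the authors' released code) of one plausible realization: the extracted keywords, the entity pair, and the entity types are packed into BERT's first segment, the original sentence into the second, and a linear layer with Softmax predicts the relation. The model name bert-base-chinese, the feature-packing scheme, and the helper build_input are assumptions for illustration only.

```python
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class MultiFeatureBertRE(nn.Module):
    """BERT encoder followed by a relation classifier (Softmax via CrossEntropyLoss)."""
    def __init__(self, num_relations, pretrained="bert-base-chinese"):
        super().__init__()
        self.bert = BertModel.from_pretrained(pretrained)
        self.classifier = nn.Linear(self.bert.config.hidden_size, num_relations)

    def forward(self, input_ids, attention_mask, token_type_ids):
        outputs = self.bert(input_ids=input_ids,
                            attention_mask=attention_mask,
                            token_type_ids=token_type_ids)
        cls_vec = outputs.pooler_output       # pooled [CLS] representation
        return self.classifier(cls_vec)       # relation logits

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")

def build_input(sentence, head, tail, head_type, tail_type, keywords):
    # Hypothetical feature packing: auxiliary features go in segment A,
    # the original sentence in segment B, separated by [SEP].
    features = " ".join(keywords + [head, head_type, tail, tail_type])
    return tokenizer(features, sentence, return_tensors="pt",
                     truncation=True, max_length=128, padding="max_length")

model = MultiFeatureBertRE(num_relations=10)
enc = build_input("乔布斯创立了苹果公司。", "乔布斯", "苹果公司",
                  "人物", "组织", ["创立"])
logits = model(enc["input_ids"], enc["attention_mask"], enc["token_type_ids"])
pred = logits.argmax(dim=-1)   # predicted relation id
```

In this sketch the fused features simply share BERT's input sequence so that self-attention can relate them to the sentence; other integration strategies (e.g., concatenating separate feature embeddings before the classifier) would also fit the description in the abstract.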

Get Citation

XIE Teng, YANG Jun-An, LIU Hui. Chinese entity relation extraction based on multi-feature BERT model. Computer Systems & Applications (计算机系统应用), 2021, 30(5): 253-261.

History
  • Received: September 10, 2020
  • Revised: October 09, 2020
  • Online: May 06, 2021