Ensemble Training Model Integrating Knowledge
Author: Wang Yongpeng, Zhou Xiaolei, Ma Huimin, Cao Jilong
Abstract:

In Internet-based healthcare, AI-based triage is a key step: it assigns patients to departments according to their conditions, disease attributes, medications, and other information. BERT, with its deep bidirectional Transformer architecture, can be pre-trained as a language model to enrich word semantics; however, the textual descriptions of patients' conditions are sparse, which prevents BERT from fully learning their features. This paper presents DNNBERT, a joint training model that integrates knowledge. By combining the strengths of a Deep Neural Network (DNN) and the Transformer model, DNNBERT learns richer semantics from text. Experiments show that DNNBERT reduces computing time by a factor of 1.7 relative to BERT-large, and that its F1 score is 0.12 higher than that of ALBERT and 0.17 higher than that of TextCNN. This work offers a new approach to sparse feature learning and to deploying deep Transformer-based models in production.
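The abstract describes a two-branch design: a pre-trained Transformer encoder for the free-text condition description and a DNN for the accompanying structured knowledge (disease attributes, medications, etc.), trained jointly. The paper's implementation is not reproduced here; the following is a minimal PyTorch sketch under that reading, in which the BERT pooled output is concatenated with the DNN's output before classification. The class name DNNBERTSketch, the sparse feature dimension, and the bert-base-chinese checkpoint are illustrative assumptions, not the authors' code.

import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class DNNBERTSketch(nn.Module):
    """Sketch of a joint text (BERT) + structured-knowledge (DNN) classifier."""
    def __init__(self, num_departments, sparse_dim=32,
                 bert_name="bert-base-chinese"):
        super().__init__()
        # Transformer branch: deep bidirectional encoder, pre-trained.
        self.bert = BertModel.from_pretrained(bert_name)
        hidden = self.bert.config.hidden_size
        # DNN branch: learns from sparse structured features
        # (disease attributes, medications, etc.).
        self.dnn = nn.Sequential(
            nn.Linear(sparse_dim, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
        )
        # Joint head over the concatenated representations.
        self.classifier = nn.Linear(hidden + 64, num_departments)

    def forward(self, input_ids, attention_mask, sparse_features):
        text = self.bert(input_ids=input_ids,
                         attention_mask=attention_mask).pooler_output
        knowledge = self.dnn(sparse_features)
        # Logits over candidate departments.
        return self.classifier(torch.cat([text, knowledge], dim=-1))

# Illustrative usage; the sparse feature vector is placeholder data.
tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
enc = tokenizer("Three days of headache with fever", return_tensors="pt")
model = DNNBERTSketch(num_departments=10)
logits = model(enc["input_ids"], enc["attention_mask"], torch.randn(1, 32))

Concatenating the two branches lets the classifier lean on structured knowledge when the free text is too sparse for the Transformer alone, which is the failure mode the abstract identifies.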

Get Citation

Wang Yongpeng, Zhou Xiaolei, Ma Huimin, Cao Jilong. Ensemble Training Model Integrating Knowledge. Computer Systems & Applications, 2021, 30(7): 50-56

History
  • Received: November 04, 2020
  • Revised: December 12, 2020
  • Online: July 02, 2021