Chinese Short Text Summarization Based on NN-Attention
Abstract:

    Previous attention models adopted a Bidirectional RNN (BRNN) encoder. The BRNN captures context information effectively, but it cannot extract high-dimensional text features, so a CNN is introduced. Because an attention model based on a single matrix transformation cannot characterize the features extracted by the CNN, a fully connected neural network is used to improve the attention mechanism, and NN-Attention is proposed. The GRU is adopted as the recurrent unit to speed up model training. The model is built in TensorFlow and evaluated on the CSTSD dataset. The experimental results show that the proposed model generates text summaries well on the CSTSD dataset.
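The abstract describes scoring encoder features with a small fully connected network rather than a single matrix transformation. The paper's exact layer sizes and wiring are not given here, so the following is only a minimal NumPy sketch of that idea: each encoder position (e.g. CNN-over-BiGRU features) and the current GRU decoder state are passed through a tanh hidden layer, projected to a scalar score, and softmax-normalized into attention weights. All shapes and weight names (`W1`, `W2`, `v`) are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
T, d_enc, d_dec, d_hid = 5, 8, 6, 4  # illustrative sizes, not from the paper

def nn_attention(decoder_state, encoder_features, W1, W2, v):
    """Score each encoder position with a small fully connected
    network (tanh hidden layer) instead of a single matrix
    transform, then softmax the scores into attention weights."""
    # (T, d_enc) @ (d_enc, d_hid) + (d_dec,) @ (d_dec, d_hid) -> (T, d_hid)
    hidden = np.tanh(encoder_features @ W1 + decoder_state @ W2)
    scores = hidden @ v                  # (T,) one scalar score per position
    exp = np.exp(scores - scores.max())  # numerically stable softmax
    weights = exp / exp.sum()
    context = weights @ encoder_features  # (d_enc,) weighted context vector
    return weights, context

# Hypothetical parameters and inputs for demonstration.
W1 = rng.normal(size=(d_enc, d_hid))
W2 = rng.normal(size=(d_dec, d_hid))
v = rng.normal(size=(d_hid,))
enc = rng.normal(size=(T, d_enc))  # stand-in for CNN/BRNN encoder features
dec = rng.normal(size=(d_dec,))    # stand-in for the current GRU decoder state

weights, context = nn_attention(dec, enc, W1, W2, v)
```

In a full summarizer, `context` would be fed back into the GRU decoder at each step; here it only illustrates how a fully connected scorer can replace a bilinear matrix-transformation attention.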

Get Citation

He Zhengfang, Liang Yu. Chinese short text summarization based on NN-Attention. 计算机系统应用 (Computer Systems & Applications), 2020, 29(7): 166-172

History
  • Received: December 01, 2019
  • Revised: December 23, 2019
  • Online: July 04, 2020
  • Published: July 15, 2020
Copyright: Institute of Software, Chinese Academy of Sciences