Computer Systems & Applications, 2020, 29(7): 166-172
Chinese Short Text Summarization Based on NN-Attention
(School of Software, Yunnan University, Kunming 650500, China)
Received: December 01, 2019    Revised: December 23, 2019
Abstract: Previous Attention models adopted only a Bidirectional RNN (BRNN). The BRNN captures context information effectively but cannot extract high-dimensional text features, so a CNN is introduced. Because an Attention model based on matrix transformation cannot characterize the features extracted by the CNN, a fully connected neural network is used to improve the Attention model, yielding NN-Attention. The GRU is adopted as the recurrent unit to speed up model training. Experiments are conducted on the CSTSD dataset, and the model is built with TensorFlow. The results show that the proposed model can generate text summaries well on the CSTSD dataset.
Keywords: Chinese; short text; summarization; GRU; CNN; NN-Attention
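The abstract above only names the NN-Attention idea, so the following is a minimal sketch of one plausible reading: the attention score is produced by a small fully connected network applied to the concatenation of the decoder state and each encoder feature (e.g. CNN-filtered BiGRU outputs). Layer names, sizes, and the concatenation scheme are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of a fully connected ("NN") attention scorer (hypothetical
# layer/variable names; sizes are illustrative, not taken from the paper).
import tensorflow as tf

class NNAttention(tf.keras.layers.Layer):
    """Scores each encoder position with a small fully connected network
    instead of a single matrix transformation."""
    def __init__(self, hidden_units=128):
        super().__init__()
        self.fc1 = tf.keras.layers.Dense(hidden_units, activation="tanh")
        self.fc2 = tf.keras.layers.Dense(1)  # scalar score per source position

    def call(self, decoder_state, encoder_features):
        # decoder_state:    (batch, dec_dim)
        # encoder_features: (batch, src_len, enc_dim), e.g. CNN-filtered BiGRU outputs
        src_len = tf.shape(encoder_features)[1]
        # Repeat the decoder state along the source axis, concatenate it with
        # the encoder features, then score each position with the two-layer network.
        query = tf.tile(tf.expand_dims(decoder_state, 1), [1, src_len, 1])
        scores = self.fc2(self.fc1(tf.concat([query, encoder_features], axis=-1)))
        weights = tf.nn.softmax(scores, axis=1)                  # (batch, src_len, 1)
        context = tf.reduce_sum(weights * encoder_features, axis=1)
        return context, tf.squeeze(weights, axis=-1)

# Usage with random tensors standing in for decoder/encoder outputs.
attn = NNAttention(hidden_units=128)
context, weights = attn(tf.random.normal([4, 256]), tf.random.normal([4, 50, 512]))
print(context.shape, weights.shape)  # (4, 512) (4, 50)
```

In a standard encoder-decoder summarizer, the resulting context vector would be fed to the GRU decoder at each step; how the paper actually wires it in is not specified on this page.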
Citation:
HE Zheng-Fang, LIANG Yu. Chinese Short Text Summarization Based on NN-Attention. Computer Systems & Applications, 2020, 29(7): 166-172