Received: December 01, 2019  Revised: December 23, 2019
Abstract (translated from Chinese): In previous Attention models, only the Bidirectional RNN (BRNN) was adopted. The BRNN handles context information well but cannot extract high-dimensional text features, so a CNN was introduced. Because the Attention model based on matrix transformation cannot characterize the features extracted by the CNN, a fully connected neural network was used to improve the Attention model, and NN-Attention was proposed. To speed up model training, GRU was adopted as the recurrent neural network. Experiments were conducted on the CSTSD dataset, and the model was built with TensorFlow. The experimental results show that the model can generate text summaries well on the CSTSD dataset.
Abstract: The Bidirectional RNN (BRNN) was adopted in previous Attention models. The BRNN handles context information effectively but cannot extract high-dimensional text features, so the CNN was introduced. Because the Attention model based on matrix transformation cannot characterize the features extracted by the CNN, a fully connected neural network is used to improve the Attention model, and NN-Attention is proposed. GRU was adopted as the recurrent neural network to speed up model training. The CSTSD dataset was used, and TensorFlow was utilized for model construction. The experimental results show that the proposed model can generate text summaries well on the CSTSD dataset.
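The abstract does not give the exact formulation of NN-Attention, but the idea it describes, replacing a single matrix transformation with a small fully connected network for computing attention scores, resembles additive (Bahdanau-style) attention. The sketch below is a hypothetical illustration under that assumption; the weight names `W1`, `W2`, and `v` are placeholders, not identifiers from the paper.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def nn_attention(decoder_state, encoder_feats, W1, W2, v):
    """Score each encoder feature with a small fully connected network
    (one hidden tanh layer) instead of a single matrix transformation:
        score_i = v . tanh(W1 @ s + W2 @ h_i)
    Returns the attention weights and the weighted context vector."""
    scores = np.array([v @ np.tanh(W1 @ decoder_state + W2 @ h)
                       for h in encoder_feats])
    weights = softmax(scores)                              # sums to 1
    context = (weights[:, None] * encoder_feats).sum(axis=0)
    return weights, context
```

In a full seq2seq summarizer, `encoder_feats` would be the CNN-enhanced BiGRU outputs and `decoder_state` the GRU decoder's hidden state at the current step; the context vector then conditions the next output token.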
Keywords: Chinese short text summarization; GRU; CNN; NN-Attention
Author Name | Affiliation | Email
HE Zheng-Fang | School of Software, Yunnan University, Kunming 650500, China | hfrommane@qq.com
LIANG Yu | School of Software, Yunnan University, Kunming 650500, China |
Citation:
何正方,梁宇.基于NN-Attention的中文短文本摘要.计算机系统应用,2020,29(7):166-172
HE Zheng-Fang,LIANG Yu.Chinese Short Text Summarization Based on NN-Attention.COMPUTER SYSTEMS APPLICATIONS,2020,29(7):166-172