Abstract: Previous attention-based models adopted the bidirectional RNN (BRNN). The BRNN captures context information effectively, but it cannot extract high-dimensional text features, so a CNN is introduced. Because an attention model based on matrix transformation cannot adequately characterize the features extracted by the CNN, a fully-connected neural network is used to improve the attention mechanism, and NN-Attention is proposed. The GRU is adopted as the recurrent unit to speed up model training. The model is built with TensorFlow and evaluated on the CSTSD dataset. The experimental results show that the proposed model can generate text summaries well on the CSTSD dataset.
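To make the contrast in the abstract concrete, the sketch below compares a matrix-transformation (bilinear) attention score with a fully-connected scorer in the spirit of NN-Attention, written with TensorFlow as named above. The function names, layer sizes, and tensor shapes are illustrative assumptions, not the authors' implementation.

```python
import tensorflow as tf

def matrix_attention(decoder_state, encoder_features, W):
    """Bilinear (matrix-transformation) attention: score_i = s^T W h_i.

    decoder_state:    [batch, d_s]
    encoder_features: [batch, T, d_h]  (e.g. CNN-extracted features)
    W:                [d_s, d_h]
    """
    projected = tf.matmul(decoder_state, W)                        # [batch, d_h]
    scores = tf.einsum('bd,btd->bt', projected, encoder_features)  # [batch, T]
    return tf.nn.softmax(scores, axis=-1)

def nn_attention(decoder_state, encoder_features, hidden_units=128):
    """Fully-connected scorer: score_i = v^T tanh(W1 h_i + W2 s).

    A small feed-forward network replaces the single matrix transform,
    which is the general idea behind NN-Attention as the abstract
    describes it (details assumed here).
    """
    w1 = tf.keras.layers.Dense(hidden_units)   # projects encoder features
    w2 = tf.keras.layers.Dense(hidden_units)   # projects decoder state
    v = tf.keras.layers.Dense(1)               # maps hidden layer to a scalar score
    s = tf.expand_dims(decoder_state, 1)                # [batch, 1, d_s]
    hidden = tf.nn.tanh(w1(encoder_features) + w2(s))   # broadcast over T
    scores = tf.squeeze(v(hidden), axis=-1)             # [batch, T]
    return tf.nn.softmax(scores, axis=-1)
```

The resulting attention weights are then used, as in standard attention decoders, to form a weighted sum of the encoder features at each GRU decoding step.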