Abstract: Traditional generative models ignore the important clues provided by keywords during abstract generation, which leads to the loss of keyword information and to generated abstracts that do not align well with the original text. In this study, an abstract generation method is proposed that takes the pointer-generator network as its framework and integrates the BERT pretraining model with keyword information. First, the TextRank algorithm and an attention-based sequence model are used to extract keywords from the original text, so that the extracted keywords carry more information about the original text. Second, keyword attention is added to the attention mechanism of the pointer-generator network to guide abstract generation. In addition, a double-pointer copy mechanism replaces the copy mechanism of the pointer-generator network, improving the coverage of the copy mechanism. Results on the LCSTS dataset show that the designed model retains more key information and improves the accuracy and readability of the generated abstracts.
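As a rough illustration of the keyword-attention idea mentioned in the abstract (a minimal sketch, not the authors' exact formulation), the following PyTorch snippet blends a standard Bahdanau-style attention distribution with a second distribution restricted to keyword positions in the source. The layer names `W_h`, `W_s`, `v`, the keyword mask, and the mixing weight `lam` are assumptions introduced for this example.

```python
import torch
import torch.nn.functional as F

def keyword_augmented_attention(dec_state, enc_states, keyword_mask, W_h, W_s, v, lam=0.5):
    """One decoder step of attention with an extra keyword term (illustrative).

    dec_state:    [batch, hidden]           current decoder hidden state
    enc_states:   [batch, src_len, hidden]  encoder outputs
    keyword_mask: [batch, src_len]          1.0 where the source token is a keyword
    W_h, W_s, v:  nn.Linear layers (assumed shapes: hidden->attn, hidden->attn, attn->1)
    lam:          weight of the keyword-attention component
    """
    # Standard additive attention scores: v^T tanh(W_h h_i + W_s s_t)
    scores = v(torch.tanh(W_h(enc_states) + W_s(dec_state).unsqueeze(1))).squeeze(-1)
    base_attn = F.softmax(scores, dim=-1)                      # [batch, src_len]

    # Keyword attention: renormalize the same scores over keyword positions only
    kw_scores = scores.masked_fill(keyword_mask == 0, float("-inf"))
    kw_attn = torch.nan_to_num(F.softmax(kw_scores, dim=-1))   # guard against all-zero masks

    # Blend the two distributions so decoding is nudged toward keyword positions
    attn = (1.0 - lam) * base_attn + lam * kw_attn
    context = torch.bmm(attn.unsqueeze(1), enc_states).squeeze(1)
    return attn, context
```

In a pointer-generator setting, the blended distribution `attn` would serve both as the copy distribution over source tokens and as the basis for the context vector fed to the generator, which is one simple way keyword information can guide both copying and generation.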