Abstract: As an important task in the field of information extraction, relation extraction aims to extract semantic relations between labeled entity pairs in sentences and plays an important role in sentence-level semantic understanding and knowledge base construction. Existing extraction methods fail to make full use of word position information and the interaction information between entities, which leads to the loss of effective features during relation extraction. To address this problem, this study proposes BPI-BERT, a relation extraction method based on position encoding and entity interaction information. A novel position encoding is integrated into the word vectors generated by the BERT pre-trained language model, and entity and sentence vectors are obtained through average pooling. The Hadamard product is used to construct the entity interaction information. Finally, the entity vectors, sentence vector, and interaction information vector are concatenated to obtain the relation vector, which is then fed into a Softmax classifier for relation classification. Experimental results show that BPI-BERT achieves significant improvements in precision and F1 score over existing methods, demonstrating its effectiveness.
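
The pipeline described above (BERT encoding, average pooling of entity and sentence tokens, Hadamard-product interaction, concatenation, Softmax classification) can be illustrated with a minimal sketch, assuming a PyTorch/Hugging Face setup; the class name, span masks, and the four-part concatenation are illustrative assumptions rather than the authors' exact implementation, and the novel position encoding is omitted here.

```python
import torch
import torch.nn as nn
from transformers import BertModel


class BPIBertSketch(nn.Module):
    """Hypothetical sketch of the relation-vector construction; not the authors' code."""

    def __init__(self, num_relations: int, model_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        # Relation vector assumed to be [head entity; tail entity; sentence; interaction].
        self.classifier = nn.Linear(4 * hidden, num_relations)

    @staticmethod
    def masked_mean(hidden_states, mask):
        # Average-pool the token vectors selected by a 0/1 mask (entity span or full sentence).
        mask = mask.unsqueeze(-1).float()
        return (hidden_states * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

    def forward(self, input_ids, attention_mask, head_mask, tail_mask):
        # Token-level word vectors from BERT (the paper's novel position encoding
        # would be integrated into these vectors before pooling; omitted here).
        h = self.bert(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        e_head = self.masked_mean(h, head_mask)   # head entity vector via average pooling
        e_tail = self.masked_mean(h, tail_mask)   # tail entity vector via average pooling
        s = self.masked_mean(h, attention_mask)   # sentence vector via average pooling
        inter = e_head * e_tail                   # entity interaction via Hadamard product
        rel = torch.cat([e_head, e_tail, s, inter], dim=-1)  # concatenated relation vector
        return torch.softmax(self.classifier(rel), dim=-1)   # Softmax relation classification
```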