Abstract: Relation classification is an important subtask in the field of Natural Language Processing (NLP), providing technical support for the construction of knowledge graphs, question answering systems, and information retrieval. Compared with traditional relation classification methods, deep learning models with attention have achieved better performance on a variety of relation classification tasks. However, most previous models use a single layer of attention, which yields only a single representation of the features. Therefore, building on existing work, this study introduces multi-head attention, which enables the model to obtain information about a sentence from different representation subspaces and improves the model's feature expression ability. In addition to the word embeddings and position embeddings used as network input in existing work, we introduce dependency parsing features and a relative core predicate dependency feature into the model. The dependency parsing features comprise the dependency relation of the current word and the position of its parent node in the dependency tree. Experimental results on the SemEval-2010 relation classification task show that the proposed method outperforms most existing methods.
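The multi-head attention mechanism referred to above can be illustrated with a minimal NumPy sketch. This is not the paper's actual architecture or configuration; the weight matrices, dimensions, and head count below are illustrative assumptions. The key idea is that each head computes scaled dot-product attention in its own lower-dimensional subspace, and the head outputs are concatenated and projected back, so the model attends to the sentence from several representation subspaces at once.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(X, Wq, Wk, Wv, Wo, num_heads):
    """Illustrative multi-head self-attention (not the paper's exact model).

    X:  (seq_len, d_model) input token representations.
    Wq, Wk, Wv, Wo: (d_model, d_model) projection matrices (assumed shapes).
    """
    seq_len, d_model = X.shape
    d_head = d_model // num_heads
    Q, K, V = X @ Wq, X @ Wk, X @ Wv

    def split_heads(M):
        # (seq_len, d_model) -> (num_heads, seq_len, d_head)
        return M.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    Qh, Kh, Vh = split_heads(Q), split_heads(K), split_heads(V)
    # Each head attends in its own representation subspace.
    scores = Qh @ Kh.transpose(0, 2, 1) / np.sqrt(d_head)
    attn = softmax(scores, axis=-1)          # (num_heads, seq_len, seq_len)
    out = attn @ Vh                          # (num_heads, seq_len, d_head)
    # Concatenate the heads and project back to d_model.
    concat = out.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ Wo
```

With a single head this reduces to ordinary one-layer attention; multiple heads let different heads specialize in different aspects of the sentence, which is the feature-diversity benefit the abstract describes.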