Abstract: At present, traditional deep learning-based relation extraction methods struggle to extract relations in complex contexts and fail to consider the impact that non-target relations in a context have on relation extraction. In response, this paper proposes a control-input long short-term memory (CI-LSTM) network that adds an input control unit, composed of an attention mechanism and a control gate valve unit, to the traditional LSTM network. The control gate valve unit performs focused learning on key positions according to a control vector, while the attention mechanism computes the distinct features of the inputs to a single LSTM network. Based on experimental comparison, this paper ultimately uses syntactic dependencies to generate the control vectors and builds a relation extraction model on this basis. Experiments are then conducted on the SemEval-2010 Task 8 relation dataset and on the subset of its samples with complex contexts. The results show that, compared with traditional relation extraction methods, the CI-LSTM network proposed in this paper achieves a further improvement in accuracy and performs better in complex contexts.
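Since the abstract only outlines the architecture at a high level, the following is a minimal illustrative sketch, not the authors' implementation, of how a control gate valve might re-weight the LSTM input using an external control vector (e.g., a mask over tokens on a syntactic dependency path). It assumes PyTorch, and the class name `CILSTMCell`, the gate formulation, and all variable names are hypothetical.

```python
# Hypothetical sketch of a CI-LSTM cell: a standard LSTM cell whose input is
# first scaled by a "control gate valve" driven by an attention score and an
# external control signal. Names and formulation are illustrative assumptions.
import torch
import torch.nn as nn


class CILSTMCell(nn.Module):
    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # Attention scoring of the current input against the previous hidden state.
        self.attn = nn.Linear(input_size + hidden_size, 1)
        # Control gate valve: combines the attention score with the control signal.
        self.valve = nn.Linear(2, 1)
        self.cell = nn.LSTMCell(input_size, hidden_size)

    def forward(self, x_t, control_t, state):
        h_prev, c_prev = state
        # Attention score for this time step.
        a_t = torch.sigmoid(self.attn(torch.cat([x_t, h_prev], dim=-1)))
        # Valve output in (0, 1) scales the input before the standard LSTM update.
        g_t = torch.sigmoid(self.valve(torch.cat([a_t, control_t], dim=-1)))
        return self.cell(g_t * x_t, (h_prev, c_prev))


# Toy usage: one sentence of 5 tokens, 32-dim embeddings, 64-dim hidden state.
if __name__ == "__main__":
    cell = CILSTMCell(32, 64)
    h, c = torch.zeros(1, 64), torch.zeros(1, 64)
    tokens = torch.randn(5, 1, 32)
    # Control vector: 1.0 for tokens assumed to lie on the dependency path.
    control = torch.tensor([1.0, 0.0, 1.0, 1.0, 0.0]).view(5, 1, 1)
    for t in range(5):
        h, c = cell(tokens[t], control[t], (h, c))
    print(h.shape)  # torch.Size([1, 64])
```

The design intent mirrored here is only that positions flagged by the control vector receive stronger input signal than others; the paper's exact gating equations are not given in the abstract.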