Distantly-supervised relation extraction aims to construct a large-scale labeled corpus efficiently and apply it to the relation extraction task. However, building a corpus by distant supervision introduces two major problems: noisy labels and a long-tail relation distribution. This study proposes a novel distantly-supervised relation extraction model. Unlike previous pipeline-style training, an external knowledge enhancement module is added alongside the sentence encoder module. By preprocessing and encoding the entity types and relations already present in the knowledge base, the module supplies the model with external knowledge that the sentence-bag text alone lacks. This helps alleviate the information shortage caused by the scarcity of long-tail relation instances in the dataset and improves the model's ability to discriminate noisy instances. In extensive experiments on the benchmark datasets NYT and GDS, the AUC improves by 0.9% and 5.7%, respectively, over the strongest mainstream models, demonstrating the effectiveness of the external knowledge enhancement module.
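The idea of augmenting a sentence-bag representation with knowledge-base entity-type information can be sketched as follows. This is a minimal illustration only: the embedding dimensions, the `encode_bag` function, the selective-attention scheme over the bag, and the concatenation of type embeddings are all assumptions for exposition, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

EMB_DIM = 8    # sentence embedding size (illustrative)
TYPE_DIM = 4   # KB entity-type embedding size (illustrative)

# Hypothetical pre-encoded KB entity types (external knowledge).
type_emb = {
    "PERSON": rng.normal(size=TYPE_DIM),
    "LOCATION": rng.normal(size=TYPE_DIM),
}

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def encode_bag(sentence_embs, head_type, tail_type, query):
    """Attention-weighted encoding of a noisy sentence bag,
    concatenated with KB entity-type embeddings."""
    S = np.stack(sentence_embs)        # (n_sentences, EMB_DIM)
    weights = softmax(S @ query)       # down-weight noisy sentences
    bag_vec = weights @ S              # (EMB_DIM,)
    # External knowledge: types of the head and tail entities.
    knowledge = np.concatenate([type_emb[head_type], type_emb[tail_type]])
    return np.concatenate([bag_vec, knowledge])  # (EMB_DIM + 2*TYPE_DIM,)

sentences = [rng.normal(size=EMB_DIM) for _ in range(3)]
query = rng.normal(size=EMB_DIM)
rep = encode_bag(sentences, "PERSON", "LOCATION", query)
print(rep.shape)  # → (16,)
```

Even for a long-tail relation with only one or two bag sentences, the concatenated type embeddings contribute signal that the bag text alone does not carry, which is the intuition behind the knowledge enhancement module.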