Abstract: Continual relation extraction aims to train models that learn new relations from evolving data streams while maintaining accurate classification of previously learned relations. However, owing to the catastrophic forgetting problem of neural networks, the model’s ability to recognize old relations degrades drastically after it is trained on new relations. To mitigate the impact of catastrophic forgetting on model performance, this study proposes a continual relation extraction method based on contrastive learning and focal loss. First, the model learns each new task by training on the original training set concatenated with its augmented samples. Second, memory samples are selected from the training set and stored for each new relation. Then, instances from the activation set are contrasted with all known relation prototypes to jointly learn the old and new relations. Finally, memory reconsolidation is performed using the relation prototypes, and focal loss is introduced to improve the model’s ability to distinguish between similar relations. Experiments conducted on the TACRED dataset show that the proposed method further alleviates catastrophic forgetting and improves the model’s classification ability.
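
For reference, the focal loss mentioned above typically follows the standard formulation of Lin et al.; the exact task-specific weighting used in this work may differ, so the following is only the generic form. For a predicted probability $p_t$ assigned to the true relation label,

$$\mathrm{FL}(p_t) = -\,\alpha_t \,(1 - p_t)^{\gamma}\,\log(p_t),$$

where the focusing parameter $\gamma \ge 0$ down-weights well-classified examples so that training concentrates on hard, easily confused relation pairs, and $\alpha_t$ is an optional class-balancing weight.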