Link Prediction Model Integrating DeepE and Contrastive Learning
Author: Weng Huimin, Guo Gongde, Lin Shishui
    Abstract:

    Most existing knowledge graph link prediction methods learn semantic information only from the head entity h, relation r, and tail entity t within a single triple; they do not consider the links between related entities and entity-relation pairs across different triples. To address this problem, this study proposes the DeepE_CL model. First, the DeepE model is used to learn the semantic information of related triples, of entities sharing the same entity-relation pair, and of entity-relation pairs sharing the same entity. Second, the semantic information extracted from related triples is used to compute the corresponding scoring function and cross-entropy loss, while the semantic information of entities sharing the same entity-relation pair, or of entity-relation pairs sharing the same entity, is optimized through a contrastive learning model, so as to predict the missing information of related triples. The proposed method is validated on four common datasets and compared with baseline models using four evaluation metrics: MR, MRR, Hit@1, and Hit@10. The experimental results show that the DeepE_CL model achieves the best results on all metrics. To further validate its usefulness, the model is also applied to a real traditional Chinese medicine (TCM) dataset. Compared with the DeepE model, DeepE_CL reduces MR by 18 and improves MRR and Hit@1 by 0.8% and 1.1%, respectively, while Hit@10 remains unchanged. The experiments demonstrate that introducing contrastive learning makes the DeepE_CL model very effective in improving the performance of knowledge graph link prediction.
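    The abstract describes two objectives trained together: a cross-entropy loss over the scoring function for a triple's candidate tail entities, and a contrastive loss that pulls together representations of entities sharing the same entity-relation pair. A minimal sketch of such a combined objective is shown below; the InfoNCE form, cosine similarity, temperature, and weight `lam` are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def cross_entropy_loss(scores, target_idx):
    # Link-prediction loss: `scores` holds the scoring-function output for
    # every candidate tail entity; `target_idx` is the true tail's index.
    return -np.log(softmax(scores)[target_idx] + 1e-12)

def info_nce_loss(anchor, positive, negatives, temperature=0.1):
    # Contrastive loss: pull the anchor embedding toward the positive
    # (e.g. another entity sharing the same (h, r) pair) and push it
    # away from the negative embeddings.
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
    logits = np.array([cos(anchor, positive)] +
                      [cos(anchor, n) for n in negatives]) / temperature
    return -np.log(softmax(logits)[0] + 1e-12)

def combined_loss(scores, target_idx, anchor, positive, negatives, lam=0.5):
    # Weighted sum of the two objectives; lam is a hypothetical trade-off weight.
    return (cross_entropy_loss(scores, target_idx)
            + lam * info_nce_loss(anchor, positive, negatives))
```

    In this sketch, a high score for the true tail drives the cross-entropy term toward zero, while an anchor embedding close to its positive and far from its negatives drives the contrastive term toward zero.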

Get Citation

Weng HM, Guo GD, Lin SS. Link prediction model integrating DeepE and contrastive learning. Computer Systems & Applications, 2025, 34(2): 206–215. (in Chinese)
History
  • Received: July 15, 2024
  • Revised: August 13, 2024
  • Online: December 16, 2024
Copyright: Institute of Software, Chinese Academy of Sciences