Few-shot Relational Triple Extraction Based on Module Transfer and Semantic Similarity Inference
    Abstract:

    Existing few-shot relational triple extraction methods often struggle to handle multiple triples in a single sentence and fail to consider the semantic similarity between the support set and the query set. To address these issues, this study proposes a few-shot relational triple extraction method based on module transfer and semantic similarity inference. The method iteratively transfers control among three modules, namely relation extraction, entity recognition, and triple discrimination, to extract multiple relational triples efficiently from a query instance. In the relation extraction module, a BiLSTM and a self-attention mechanism are integrated to better capture the sequence information of emergency plan texts. In addition, a method based on semantic similarity inference is designed to recognize emergency organizational entities in sentences. Finally, extensive experiments are conducted on ERPs+, a dataset of emergency response plans. Experimental results show that the proposed model is better suited to relational triple extraction in the field of emergency plans than other baseline models.
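    The abstract does not give implementation details, but the semantic similarity inference it mentions can be pictured, as an illustrative sketch only (not the authors' exact method), as nearest-prototype matching: each entity class gets a prototype averaged from its support-set token embeddings, and each query token takes the label of its most similar prototype. All function names, the 2-D toy embeddings, and the similarity threshold below are hypothetical.

```python
import numpy as np

def build_prototypes(support_embeddings, support_labels):
    """Average the support-set token embeddings of each entity class
    into one prototype vector per class (hypothetical helper)."""
    classes = sorted(set(support_labels))
    protos = np.stack([
        np.mean([e for e, l in zip(support_embeddings, support_labels) if l == c],
                axis=0)
        for c in classes
    ])
    return protos, classes

def classify_tokens(query_embeddings, prototypes, labels, threshold=0.5):
    """Label each query token with the class of its most similar prototype
    (cosine similarity); tokens below the threshold are tagged 'O'."""
    # Normalize prototypes once so the dot product equals cosine similarity.
    protos = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    out = []
    for vec in query_embeddings:
        sims = protos @ (vec / np.linalg.norm(vec))
        best = int(np.argmax(sims))
        out.append(labels[best] if sims[best] >= threshold else "O")
    return out
```

    In a real few-shot setting the embeddings would come from a pretrained encoder such as BERT rather than from toy vectors, and the threshold would be tuned on validation episodes.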

    References
    [1] Ni WJ, Shen QL, Liu T, et al. Generating textual emergency plans for unconventional emergencies—A natural language processing approach. Safety Science, 2023, 160: 106047.
    [2] Wang YC, Yu BW, Zhang YY, et al. TPLinker: Single-stage joint extraction of entities and relations through token pair linking. Proceedings of the 28th International Conference on Computational Linguistics. Barcelona: International Committee on Computational Linguistics, 2020. 1572–1582.
    [3] Wei ZP, Su JL, Wang Y, et al. A novel cascade binary tagging framework for relational triple extraction. Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Association for Computational Linguistics, 2020. 1476–1488.
    [4] Zheng HY, Wen R, Chen X, et al. PRGC: Potential relation and global correspondence based joint relational triple extraction. Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing. Association for Computational Linguistics, 2021. 6225–6235.
    [5] Mintz M, Bills S, Snow R, et al. Distant supervision for relation extraction without labeled data. Proceedings of the Joint Conference of the 47th Annual Meeting of the ACL and the 4th International Joint Conference on Natural Language Processing of the AFNLP. Suntec: Association for Computational Linguistics, 2009. 1003–1011.
    [6] Zhao KL, Jin XL, Wang YZ. A survey of few-shot learning. Journal of Software, 2021, 32(2): 349–369 (in Chinese).
    [7] Yu HY, Zhang NY, Deng SM, et al. Bridging text and knowledge with multi-prototype embedding for few-shot relational triple extraction. Proceedings of the 28th International Conference on Computational Linguistics. Barcelona: International Committee on Computational Linguistics, 2020. 6399–6410.
    [8] Fei JB, Zeng WX, Zhao X, et al. Few-shot relational triple extraction with perspective transfer network. Proceedings of the 31st ACM International Conference on Information & Knowledge Management. Atlanta: ACM, 2022. 488–498.
    [9] E HH, Zhang WJ, Xiao SQ, et al. A survey of entity relation extraction based on deep learning. Journal of Software, 2019, 30(6): 1793–1818 (in Chinese).
    [10] Cai R, Zhang XD, Wang HF. Bidirectional recurrent convolutional neural network for relation classification. Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Berlin: Association for Computational Linguistics, 2016. 756–765.
    [11] He XL, Song H, Cheng D, et al. Few-shot relational triple extraction with nearest neighbor matching. Proceedings of the 2022 International Conference on Computer Graphics, Artificial Intelligence, and Data Processing. Harbin: SPIE, 2022. 262–266.
    [12] Cong X, Sheng JW, Cui SY, et al. Relation-guided few-shot relational triple extraction. Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval. Madrid: ACM, 2022. 2206–2213.
    [13] Han X, Zhu H, Yu PF, et al. FewRel: A large-scale supervised few-shot relation classification dataset with state-of-the-art evaluation. Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Brussels: Association for Computational Linguistics, 2018. 4803–4809.
    [14] Wang YQ, Yao QM, Kwok JT, et al. Generalizing from a few examples: A survey on few-shot learning. ACM Computing Surveys (CSUR), 2020, 53(3): 63.
    [15] Zheng SC, Xu JM, Zhou P, et al. A neural network framework for relation extraction: Learning entity semantic and relation pattern. Knowledge-based Systems, 2016, 114: 12–23.
    [16] Devlin J, Chang MW, Lee K, et al. BERT: Pre-training of deep bidirectional Transformers for language understanding. Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Minneapolis: Association for Computational Linguistics, 2019. 4171–4186.
    [17] Ayetiran EF. Attention-based aspect sentiment classification using enhanced learning through CNN-BiLSTM networks. Knowledge-based Systems, 2022, 252: 109409.
    [18] Vaswani A, Shazeer N, Parmar N, et al. Attention is all you need. Proceedings of the 31st International Conference on Neural Information Processing Systems. Long Beach: Curran Associates Inc., 2017. 6000–6010.
    [19] Jiang S, Zhu JZ, He LH. Few-shot relational triple extraction based on evaluation of token-level semantic similarity. Proceedings of the 32nd International Conference on Artificial Neural Networks. Heraklion: Springer, 2023. 232–242.
    [20] Yang Y, Katiyar A. Simple and effective few-shot named entity recognition with structured nearest neighbor learning. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing. Association for Computational Linguistics, 2020. 6365–6375.
Get Citation

Liu T, Liu BX, Ni WJ. Few-shot relational triple extraction based on module transfer and semantic similarity inference. Computer Systems & Applications, 2025, 34(1): 190–199 (in Chinese).

History
  • Received: June 01, 2024
  • Revised: June 26, 2024
  • Online: November 15, 2024
Copyright: Institute of Software, Chinese Academy of Sciences. Beijing ICP No. 05046678-3