
Prompt Tuning for Few-shot Relation Extraction via Modeling Global and Local Graphs

Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)

DOI:10.63317/3ehsmnvdxvv3

Abstract

Prompt-tuning has recently achieved strong results on few-shot tasks. Its core idea is to insert prompt templates into the input, converting a classification task into a masked language modeling problem. For few-shot relation extraction, however, mining as much information as possible from limited resources becomes particularly important. In this paper, we first construct a global relation graph based on label consistency to optimize the feature representations of samples across different relations. The global relation graph is then partitioned into a local relation subgraph for each relation type to optimize the feature representations of samples within the same relation. This makes full use of the limited supervised information and improves tuning efficiency. In addition, relation labels carry rich semantic knowledge that should not be ignored, so we incorporate this knowledge into prompt-tuning: the latent knowledge implicit in relation labels is injected into the construction of learnable prompt templates. Extensive experiments on four datasets under low-resource settings show that the method achieves significant improvements.
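The abstract's core idea (wrapping the input in a template whose mask slot is filled by a masked language model) can be sketched as follows. This is a minimal illustration of the general prompt-tuning setup, not the paper's method; the template wording and function name are hypothetical.

```python
# Hedged sketch: converting a relation-extraction instance into a
# masked-LM prompt. A real system would feed this prompt to a masked
# language model and map the predicted mask token to a relation label.

def build_prompt(sentence: str, head: str, tail: str,
                 mask_token: str = "[MASK]") -> str:
    """Wrap the input sentence with a template so that predicting the
    relation becomes filling in the mask token."""
    return (f"{sentence} In this sentence, "
            f"{head} is the {mask_token} of {tail}.")

prompt = build_prompt(
    "Steve Jobs co-founded Apple in 1976.",
    head="Steve Jobs",
    tail="Apple",
)
print(prompt)
```

A verbalizer would then map candidate fillers for the mask (e.g. "founder") back to relation labels, which is how classification is recast as masked language modeling.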

Details

Paper ID
lrec2024-main-1158
Pages
pp. 13233-13242
BibKey
zhang-etal-2024-prompt-tuning
Editor
N/A
Publisher
European Language Resources Association (ELRA) and ICCL
ISSN
2522-2686
ISBN
979-10-95546-34-4
Conference
Joint International Conference on Computational Linguistics, Language Resources and Evaluation
Location
Turin, Italy
Date
20 May 2024 – 25 May 2024

Authors

  • Zirui Zhang
  • Yiyu Yang
  • Benhui Chen