
PDAMeta: Meta-Learning Framework with Progressive Data Augmentation for Few-Shot Text Classification

Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)

DOI:10.63317/2kpqdoqtfn2h

Abstract

Recently, we have witnessed breakthroughs of meta-learning in few-shot learning scenarios. Data augmentation is essential for meta-learning, particularly when data is extremely scarce. However, existing text data augmentation methods cannot ensure the diversity and quality of the generated data, which leads to sub-optimal performance. Inspired by the recent success of large language models (LLMs), which demonstrate improved language comprehension abilities, we propose a Meta-learning framework with Progressive Data Augmentation (PDAMeta) for few-shot text classification, which contains a two-stage data augmentation strategy. First, prompt-based data augmentation enriches the diversity of the training instances from a global perspective. Second, attention-based data augmentation further improves the data quality from a local perspective. Finally, we propose a dual-stream contrastive meta-learning strategy to learn discriminative text representations from both original and augmented instances. Extensive experiments conducted on four public few-shot text classification datasets show that PDAMeta significantly outperforms several state-of-the-art models and shows better robustness.
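The dual-stream contrastive objective mentioned in the abstract can be illustrated with an InfoNCE-style loss that pulls each original instance toward its augmented counterpart and pushes it away from other instances in the batch. This is a minimal sketch under that assumption, not the paper's actual implementation; the function name and hyperparameters are hypothetical.

```python
import numpy as np

def dual_stream_contrastive_loss(orig, aug, temperature=0.1):
    """InfoNCE-style contrastive loss over paired embeddings.

    orig, aug: (N, D) arrays; row i of `aug` is the augmented view
    of row i of `orig`, so positives lie on the diagonal.
    """
    # L2-normalize both streams so the dot product is cosine similarity
    orig = orig / np.linalg.norm(orig, axis=1, keepdims=True)
    aug = aug / np.linalg.norm(aug, axis=1, keepdims=True)
    logits = orig @ aug.T / temperature          # (N, N) similarity matrix
    # numerically stable log-softmax over each row
    logits = logits - logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # cross-entropy with the matching augmented instance as the target
    return -np.mean(np.diag(log_probs))

# toy usage: 4 instances with 16-dim embeddings
rng = np.random.default_rng(0)
orig = rng.standard_normal((4, 16))
aug = orig + 0.05 * rng.standard_normal((4, 16))  # augmented views near originals
loss = dual_stream_contrastive_loss(orig, aug)
```

Because the augmented views sit close to their originals, the diagonal similarities dominate and the loss stays small; as augmentation quality degrades, the loss grows, which is the signal such a contrastive strategy exploits.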

Details

Paper ID
lrec2024-main-1109
Pages
pp. 12668-12678
BibKey
li-etal-2024-pdameta
Editor
N/A
Publisher
European Language Resources Association (ELRA) and ICCL
ISSN
2522-2686
ISBN
979-10-95546-34-4
Conference
Joint International Conference on Computational Linguistics, Language Resources and Evaluation
Location
Turin, Italy
Date
20–25 May 2024

Authors

  • Xurui Li
  • Kaisong Song
  • Tianqianjin Lin
  • Yangyang Kang
  • Fubang Zhao
  • Changlong Sun
  • Xiaozhong Liu

Links