
MoPE: Mixture of Prefix Experts for Zero-Shot Dialogue State Tracking

Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)

DOI: 10.63317/5a5virig9go3

Abstract

Zero-shot dialogue state tracking (DST) transfers knowledge to unseen domains, reducing the cost of annotating new datasets. Previous zero-shot DST models mainly suffer from domain transfer and partial prediction problems. To address these challenges, we propose Mixture of Prefix Experts (MoPE) to establish connections between similar slots in different domains, which strengthens the model's transfer performance in unseen domains. Empirical results demonstrate that MoPE-DST achieves a joint goal accuracy of 57.13% on MultiWOZ2.1 and 55.40% on SGD.
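The abstract describes the approach only at a high level, so the following is a minimal, hypothetical PyTorch sketch of what a mixture-of-prefix-experts layer could look like. It is not the authors' implementation: the expert count, prefix length, pooled slot-description representation, and the cosine-similarity soft router are all assumptions made for illustration; only the general idea, a routed mixture of trainable prefixes that similar slots can share across domains, comes from the abstract.

```python
# Hypothetical sketch of a mixture-of-prefix-experts layer (not the
# authors' released code). Hyperparameters and the router are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MixtureOfPrefixExperts(nn.Module):
    def __init__(self, num_experts=8, prefix_len=10, hidden_size=768):
        super().__init__()
        # Each expert is a bank of trainable prefix vectors that can be
        # prepended to the encoder's input embeddings (prefix tuning).
        self.experts = nn.Parameter(
            torch.randn(num_experts, prefix_len, hidden_size) * 0.02
        )
        # Expert keys used by the router: similar slot descriptions should
        # select similar experts, which is one plausible reading of the
        # cross-domain transfer mechanism the abstract describes.
        self.expert_keys = nn.Parameter(
            torch.randn(num_experts, hidden_size) * 0.02
        )

    def forward(self, slot_repr):
        """slot_repr: (batch, hidden) pooled embedding of a slot description."""
        # Soft routing via cosine similarity between slot and expert keys.
        scores = F.cosine_similarity(
            slot_repr.unsqueeze(1), self.expert_keys.unsqueeze(0), dim=-1
        )                                   # (batch, num_experts)
        weights = F.softmax(scores, dim=-1)
        # Weighted mixture of expert prefixes: (batch, prefix_len, hidden).
        return torch.einsum("be,elh->blh", weights, self.experts)


# Usage: prepend the mixed prefix to token embeddings before the encoder.
mope = MixtureOfPrefixExperts()
slot_repr = torch.randn(4, 768)         # e.g. pooled "hotel-price range"
token_embeds = torch.randn(4, 32, 768)  # dialogue-context embeddings
inputs = torch.cat([mope(slot_repr), token_embeds], dim=1)
print(inputs.shape)                     # torch.Size([4, 42, 768])
```

Under these assumptions, soft routing lets a slot from an unseen domain reuse prefixes learned for similar slots in the training domains, rather than relying on a single domain-specific prompt.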

Details

Paper ID
lrec2024-main-1012
Pages
pp. 11582-11592
BibKey
tang-etal-2024-mope
Editor
N/A
Publisher
European Language Resources Association (ELRA) and ICCL
ISSN
2522-2686
ISBN
979-10-95546-34-4
Conference
Joint International Conference on Computational Linguistics, Language Resources and Evaluation
Location
Turin, Italy
Date
20–25 May 2024

Authors

  • Tianwen Tang
  • Tong Zhu
  • Haodong Liu
  • Yin Bai
  • Jia Cheng
  • Wenliang Chen
