Off the Hamster Wheel: Rethinking Dialogue Research through a Meta-Analysis of the ACL Anthology 2024
Proceedings of the Fifteenth Language Resources and Evaluation Conference (LREC 2026)
Abstract
In this paper, we take a meta-review approach to investigate how conversation is currently studied in the field by analysing papers from the ACL Anthology 2024. We retrieved 407 papers, representing about 6.1% of those published in the selected venues, and manually reviewed them to determine the conversational task addressed, the corpora used, and the evaluation methods employed. Our analysis leads to several observations. First, dialogue systems account for about half of the reviewed papers, while more formal and analytical approaches cover only 12%. Second, many papers provide insufficient corpus descriptions, signalling a detachment from the data, which is reduced to a mere tool rather than treated as one of the pillars on which NLP/CL applications should rest. Third, the evaluation methods, particularly for dialogue systems, often fail to assess the interactional aspects of these systems or rely on assumptions not backed by evidence from the dialogue research community. We argue that the field would benefit from a renewed focus on the analysis and formal representation of conversation, a richer evaluation culture that includes interactional quality, and more systematic practices for presenting data in papers.