
Interactive Evaluation of Dialog Track at DSTC9

Proceedings of the Thirteenth International Conference on Language Resources and Evaluation (LREC 2022)

DOI:10.63317/4aut3k2wm2i9

Abstract

The ultimate goal of dialog research is to develop systems that can be effectively used in interactive settings by real users. To this end, we introduced the Interactive Evaluation of Dialog Track at the 9th Dialog System Technology Challenge. The track consisted of two sub-tasks: the first involved building knowledge-grounded response generation models; the second aimed to extend dialog models beyond static datasets by assessing them in an interactive setting with real users. The track challenged participants to develop strong response generation models and to explore strategies for extending them to back-and-forth interactions with real users. The progression from static corpora to interactive evaluation introduces unique challenges and enables a more thorough assessment of open-domain dialog systems. This paper provides an overview of the track, including its methodology and results, and offers insights into how best to evaluate open-domain dialog models.

Details

Paper ID
lrec2022-main-616
Pages
pp. 5731-5738
BibKey
mehri-etal-2022-interactive
Editor
N/A
Publisher
European Language Resources Association (ELRA)
ISSN
2522-2686
ISBN
979-10-95546-38-2
Conference
Thirteenth Language Resources and Evaluation Conference
Location
Marseille, France
Date
20–25 June 2022

Authors

  • Shikib Mehri
  • Yulan Feng
  • Carla Gordon
  • Seyed Hossein Alavi
  • David Traum
  • Maxine Eskenazi
