
K-MIND: Korean Multimodal INteraction Data for Dyadic Conversation Analysis

Proceedings of the Fifteenth Language Resources and Evaluation Conference (LREC 2026)

DOI:10.63317/3mz4q73vpu6q

Abstract

We present the Korean Multimodal INteraction Data (K-MIND), a large-scale corpus of dyadic Korean dialogue designed to capture the multimodal richness of social interaction. The dataset comprises 292 participants and 200 sets (935 clips) spanning 115 hours and 30 minutes, all aligned across verbal, paraverbal, and nonverbal modalities such as transcripts, acoustic features, and visual signals. For these modalities, we propose a comprehensive annotation scheme that enables nuanced yet consistent labeling of complex communicative behaviors, balancing theoretical soundness with practical feasibility. We further report analyses of the corpus, including label distributions and within- and cross-layer analyses. These analyses illuminate the key properties of K-MIND and demonstrate its utility for advancing research in human–computer interaction as well as in interdisciplinary domains. To ensure continuous refinement, the corpus and framework are being validated in complementary studies and have been extended to triadic interactions (K-MIND Triadic) that model group dynamics, which will be included in upcoming releases.

Details

Paper ID
lrec2026-main-715
Pages
pp. 9105-9117
BibKey
yang-etal-2026-mind
Editor
N/A
Publisher
European Language Resources Association (ELRA)
ISSN
2522-2686
ISBN
978-2-493814-49-4
Conference
The Fifteenth Language Resources and Evaluation Conference (LREC 2026)
Location
Palma, Mallorca, Spain
Date
11–16 May 2026

Authors

  • Jae Hee Yang
  • Yuha Shin
  • Saim Shin
  • Je Woo Kim
  • Jin Yea Jang
