
Tracing How Annotators Think: Augmenting Preference Judgments with Reading Processes

Proceedings of the Fifteenth Language Resources and Evaluation Conference (LREC 2026)

DOI:10.63317/3nrqhnhixp4j

Abstract

We propose an annotation approach that captures not only labels but also the reading process underlying annotators’ decisions, e.g., which parts of the text they focus on, re-read, or skim. Using this approach, we conduct a case study on the preference annotation task and create a dataset, PreferRead, that contains fine-grained annotator reading behaviors obtained from mouse tracking. PreferRead enables detailed analysis of how annotators navigate between a prompt and two candidate responses before selecting their preference. We find that annotators re-read a response in roughly half of all trials, most often revisiting the option they ultimately choose, and rarely revisit the prompt. Reading behaviors are also significantly related to annotation outcomes: re-reading is associated with higher inter-annotator agreement, whereas long reading paths and times are associated with lower agreement. These results demonstrate that reading processes provide a complementary cognitive dimension for understanding annotator reliability, decision-making, and disagreement in complex, subjective NLP tasks.

Details

Paper ID
lrec2026-main-510
Pages
pp. 6428-6438
BibKey
langis-etal-2026-tracing
Editor
N/A
Publisher
European Language Resources Association (ELRA)
ISSN
2522-2686
ISBN
978-2-493814-49-4
Conference
The Fifteenth Language Resources and Evaluation Conference (LREC 2026)
Location
Palma, Mallorca, Spain
Date
11–16 May 2026

Authors

  • Karin Johanna Denton de Langis

  • William Walker

  • Khanh Chi Le

  • Dongyeop Kang
