
MEUR: A Benchmark for Evaluating Vision-Language Models on Multimodal Event Understanding and Reasoning

Proceedings of the Fifteenth Language Resources and Evaluation Conference (LREC 2026)

DOI:10.63317/4ftadqtyt374

Abstract

Event understanding and reasoning play a critical role in thoroughly evaluating the capabilities of Vision-Language Models (VLMs); however, existing Visual Question Answering (VQA) datasets predominantly focus on entity-centric questions, while event- or action-related questions are limited in scale and suffer from significant shortcut issues. We introduce MEUR, the first Multimodal Event Understanding and Reasoning dataset, consisting of 1,200 images and 4,217 questions that require a diverse range of multimodal understanding and reasoning capabilities to answer, ranging from basic event recognition to more complex tasks such as counting and comparison. To streamline the annotation process, we propose a novel semi-automated pipeline that combines advanced VLMs with human annotators, achieving both high quality and efficiency. We conduct extensive experiments on state-of-the-art non-thinking and thinking VLMs to demonstrate their capabilities and limitations in multimodal event understanding and reasoning. Furthermore, we provide a detailed error analysis that points out promising directions for future research.

Details

Paper ID
lrec2026-main-134
Pages
pp. 1696-1709
BibKey
wang-etal-2026-meur
Editor
N/A
Publisher
European Language Resources Association (ELRA)
ISSN
2522-2686
ISBN
978-2-493814-49-4
Conference
The Fifteenth Language Resources and Evaluation Conference (LREC 2026)
Location
Palma, Mallorca, Spain
Date
11–16 May 2026

Authors

  • Zimu Wang
  • Yuqi Wang
  • Tong Chen
  • Changyu Zeng
  • Hongbin Na
  • Nijia Han
  • Fuyu Xing
  • Qi Chen
  • Qiufeng Wang
  • Anh Nguyen
  • Shuihua Wang
  • Ling Chen
  • Jionglong Su
  • Haiyang Zhang
  • Wei Wang
