
Evaluating Retrieval for Multi-domain Scientific Publications

Proceedings of the Thirteenth International Conference on Language Resources and Evaluation (LREC 2022)

DOI:10.63317/5oxbjnecickk

Abstract

This paper provides an overview of the xDD/LAPPS Grid framework and presents results of evaluating the AskMe retrieval engine using the BEIR benchmark datasets. Our primary goal is to establish a solid baseline of performance to guide further development of our retrieval capabilities. Beyond this, we aim to dig deeper to determine when and why certain approaches perform well (or badly) on both in-domain and out-of-domain data, an issue that has to date received relatively little attention.

Details

Paper ID
lrec2022-main-487
Pages
pp. 4569-4576
BibKey
ide-etal-2022-evaluating
Editor
N/A
Publisher
European Language Resources Association (ELRA)
ISSN
2522-2686
ISBN
79-10-95546-38-2
Conference
Thirteenth Language Resources and Evaluation Conference
Location
Marseille, France
Date
20–25 June 2022

Authors

  • Nancy Ide
  • Keith Suderman
  • Jingxuan Tu
  • Marc Verhagen
  • Shanan Peters
  • Ian Ross
  • John Lawson
  • Andrew Borg
  • James Pustejovsky
