LREC 2020 Workshop

Cross-lingual Information Retrieval with BERT

Proceedings of the workshop on Cross-Language Search and Summarization of Text and Speech (CLSSTS2020)

DOI:10.63317/4e4iev8e63nh

Abstract

Several neural language models, e.g., BERT and XLNet, have been developed recently and have achieved impressive results in various NLP tasks, including sentence classification, question answering, and document ranking. In this paper, we explore the use of the popular bidirectional language model BERT to model and learn the relevance between English queries and foreign-language documents in the task of cross-lingual information retrieval (CLIR). A deep relevance matching model based on BERT is introduced and trained by fine-tuning a pretrained multilingual BERT model with weak supervision, using home-made CLIR training data derived from parallel corpora. Experimental results on retrieving Lithuanian documents with short English queries show that our model is effective and outperforms competitive baseline approaches.
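The weak-supervision step the abstract describes — deriving CLIR training data from parallel corpora — can be sketched roughly as follows: each aligned (English, foreign) sentence pair becomes a positive relevance example, and randomly sampled foreign sentences from other pairs become negatives. This is a minimal illustration only; the function and parameter names are hypothetical, not the authors' actual pipeline.

```python
import random

def make_clir_training_pairs(parallel_corpus, num_negatives=1, seed=0):
    """Build weakly supervised CLIR training examples from a parallel corpus.

    parallel_corpus: list of (english_sentence, foreign_sentence) pairs.
    Returns (query, document, label) triples, where label 1 marks an
    aligned pair (assumed relevant) and 0 a randomly mismatched pair.
    Illustrative sketch, not the paper's exact data-construction method.
    """
    rng = random.Random(seed)
    foreign_pool = [foreign for _, foreign in parallel_corpus]
    examples = []
    for english, foreign in parallel_corpus:
        examples.append((english, foreign, 1))  # aligned pair -> relevant
        for _ in range(num_negatives):
            negative = rng.choice(foreign_pool)
            while negative == foreign:  # avoid sampling the true translation
                negative = rng.choice(foreign_pool)
            examples.append((english, negative, 0))  # mismatched -> non-relevant
    return examples

# Toy English-Lithuanian parallel corpus (illustrative sentences only).
corpus = [
    ("good morning", "labas rytas"),
    ("thank you", "aciu"),
    ("the cat sleeps", "kate miega"),
]
examples = make_clir_training_pairs(corpus, num_negatives=1)
```

Triples like these can then be fed to a BERT-style cross-encoder as (query, document) input pairs with a binary relevance objective for fine-tuning.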

Details

Paper ID
lrec2020-ws-clssts-05
Pages
pp. 26-31
BibKey
jiang-etal-2020-cross
Editor
N/A
Publisher
European Language Resources Association (ELRA)
ISSN
N/A
ISBN
N/A
Workshop
Proceedings of the workshop on Cross-Language Search and Summarization of Text and Speech (CLSSTS2020)
Location
N/A
Date
11–16 May 2020

Authors

  • Zhuolin Jiang
  • Amro El-Jaroudi
  • William Hartmann
  • Damianos Karakos
  • Lingjun Zhao

Links