
A Family of Pretrained Transformer Language Models for Russian

Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)

DOI: 10.63317/2y5o98mk2z5m

Abstract

Transformer language models (LMs) are fundamental to NLP research methodologies and applications in various languages. However, developing such models specifically for the Russian language has received little attention. This paper introduces a collection of 13 Russian Transformer LMs spanning encoder (ruBERT, ruRoBERTa, ruELECTRA), decoder (ruGPT-3), and encoder-decoder (ruT5, FRED-T5) architectures. We report on the models' architecture design and pretraining, and present the results of evaluating their generalization abilities on Russian language understanding and generation datasets and benchmarks. By pretraining and releasing these specialized Transformer LMs, we aim to broaden the scope of NLP research directions and enable the development of industrial solutions for the Russian language.
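
As a usage illustration, the sketch below loads one of the released encoder models with the Hugging Face transformers library and fills in a masked token in a Russian sentence. This is a minimal sketch, assuming the checkpoints are published on the Hugging Face Hub; the ai-forever/ruBert-base repository ID is an assumption, not something stated in the abstract, so substitute the actual model name from the paper's release.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Assumed Hub ID for ruBERT; check the paper's release page for the real one.
model_id = "ai-forever/ruBert-base"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# "The capital of Russia is [MASK]."
text = f"Столица России — {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")
logits = model(**inputs).logits

# Locate the masked position and decode the highest-scoring replacement token.
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted_id = logits[0, mask_pos].argmax(-1)
print(tokenizer.decode(predicted_id))
```

The decoder (ruGPT-3) and encoder-decoder (ruT5, FRED-T5) models would instead be loaded with AutoModelForCausalLM and AutoModelForSeq2SeqLM, respectively.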

Details

Paper ID
lrec2024-main-0045
Pages
pp. 507-524
BibKey
zmitrovich-etal-2024-family
Editor
N/A
Publisher
European Language Resources Association (ELRA) and ICCL
ISSN
2522-2686
ISBN
979-10-95546-34-4
Conference
Joint International Conference on Computational Linguistics, Language Resources and Evaluation
Location
Turin, Italy
Date
20–25 May 2024

Authors

  • Dmitry Zmitrovich
  • Aleksandr Abramov
  • Andrey Kalmykov
  • Vitaly Kadulin
  • Maria Tikhonova
  • Ekaterina Taktasheva
  • Danil Astafurov
  • Mark Baushenko
  • Artem Snegirev
  • Tatiana Shavrina
  • Sergei S. Markov
  • Vladislav Mikhailov
  • Alena Fenogenova
