LREC-COLING 2024 Main

On the Way to Lossless Compression of Language Transformers: Exploring Cross-Domain Properties of Quantization

Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)

DOI: 10.63317/55jabgg3m92d

Abstract

Modern Transformers have achieved impressive results on various Natural Language Processing tasks over the last few years. One downside of this success is the size of these models. Their huge capacity, which sometimes surpasses billions of parameters, improves generalization but makes the models difficult to deploy. The developing field of model compression seeks to reduce model size and inference latency. This research focuses on one of these compression techniques: Post-Training Quantization. We present a methodology to effectively quantize at least 95% of Transformer weights and corresponding activations to INT8 without any access to task-specific data, such that the drop in performance does not exceed 0.02%. Furthermore, we provide intriguing observations that reflect the cross-domain nature of some quantization properties.
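
For illustration only: the abstract does not spell out the paper's calibration procedure, so the sketch below shows generic symmetric per-tensor post-training INT8 quantization of a single weight matrix (NumPy-based; the function names, layer size, and error metric are hypothetical), not the authors' specific method.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric per-tensor post-training quantization of float weights to INT8.

    Returns the INT8 tensor and the scale needed to dequantize it.
    """
    # The scale maps the largest absolute weight onto the INT8 range [-127, 127].
    scale = np.max(np.abs(weights)) / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    # Approximate reconstruction of the original float weights.
    return q.astype(np.float32) * scale

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # A hypothetical Transformer-sized linear layer (768 x 768 weights).
    w = rng.normal(scale=0.02, size=(768, 768)).astype(np.float32)
    q, s = quantize_int8(w)
    err = np.abs(dequantize(q, s) - w).mean()
    print(f"mean absolute quantization error: {err:.6f}")
```

In practice, post-training quantization pipelines also calibrate activation ranges on a small sample of inputs; the paper's contribution is doing this without task-specific data, which the toy example above does not attempt to reproduce.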

Details

Paper ID: lrec2024-main-1089
Pages: 12435–12442
BibKey: martynov-etal-2024-way
Editor: N/A
Publisher: European Language Resources Association (ELRA) and ICCL
ISSN: 2522-2686
ISBN: 979-10-95546-34-4
Conference: Joint International Conference on Computational Linguistics, Language Resources and Evaluation
Location: Turin, Italy
Date: 20–25 May 2024

Authors

  • Nikita Martynov
  • Aleksei Goncharov
  • Gleb Kumichev
  • Evgeniy Egorov
  • Stanislav Vladimirovich Pavlov
  • Mikhail Sergeevich Durinov
  • Aleksandr Sergeevich Zuev
  • Egor Anatolievich Filimonov

Links