LREC-COLING 2024 Main

MoNMT: Modularly Leveraging Monolingual and Bilingual Knowledge for Neural Machine Translation

Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)

DOI:10.63317/3hkhftnfzwr7

Abstract

Effectively using monolingual and bilingual knowledge is a critical challenge in the neural machine translation (NMT) community. In this paper, we propose a modular strategy that lets these two types of knowledge cooperate in translation tasks while avoiding catastrophic forgetting and exhibiting superior model generalization and robustness. Our model comprises three functionally independent modules: an encoding module, a decoding module, and a transferring module. The former two acquire large-scale monolingual knowledge via self-supervised learning, while the latter is trained on parallel data and is responsible for transferring latent features between the encoding and decoding modules. Extensive experiments on multi-domain translation tasks show that our model yields remarkable performance, with up to 7 BLEU points of improvement in out-of-domain tests over the conventional pretrain-and-finetune approach. Our code is available at https://github.com/NLP2CT/MoNMT.
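The modular pipeline the abstract describes can be sketched as follows. This is a hedged toy illustration, not the authors' implementation: the class names (`Encoder`, `Transfer`, `Decoder`, `MoNMT`) and the integer "latent features" are placeholders chosen only to show the wiring, in which monolingually pretrained encoding/decoding modules stay fixed while a separate transfer module, trained on parallel data, bridges their latent spaces.

```python
# Toy sketch of the MoNMT modular decomposition (assumed structure, not the
# paper's code): frozen monolingual encoder/decoder + trained transfer module.

class Encoder:
    """Stands in for a monolingually pretrained source-side module."""
    def encode(self, tokens):
        # Toy "latent feature": one integer per token (here, its length).
        return [len(t) for t in tokens]

class Decoder:
    """Stands in for a monolingually pretrained target-side module."""
    def decode(self, features):
        # Toy generation: map each latent feature back to a token string.
        return ["tok%d" % f for f in features]

class Transfer:
    """The only module trained on parallel data: maps source latents
    to target latents, leaving the pretrained modules untouched."""
    def __init__(self, shift=1):
        self.shift = shift  # placeholder for learned parameters

    def transfer(self, features):
        return [f + self.shift for f in features]

class MoNMT:
    """Composes the three functionally independent modules."""
    def __init__(self, encoder, transfer, decoder):
        self.encoder = encoder
        self.transfer_mod = transfer
        self.decoder = decoder

    def translate(self, tokens):
        latents = self.encoder.encode(tokens)          # monolingual knowledge
        bridged = self.transfer_mod.transfer(latents)  # bilingual mapping
        return self.decoder.decode(bridged)

model = MoNMT(Encoder(), Transfer(), Decoder())
print(model.translate(["hello", "world"]))  # ['tok6', 'tok6']
```

Because only `Transfer` holds bilingually trained parameters, the pretrained encoder and decoder are never overwritten during bilingual training, which is the mechanism the abstract credits for avoiding catastrophic forgetting.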

Details

Paper ID
lrec2024-main-1010
Pages
pp. 11560-11573
BibKey
pang-etal-2024-monmt
Editor
N/A
Publisher
European Language Resources Association (ELRA) and ICCL
ISSN
2522-2686
ISBN
979-10-95546-34-4
Conference
Joint International Conference on Computational Linguistics, Language Resources and Evaluation
Location
Turin, Italy
Date
20–25 May 2024

Authors

  • Jianhui Pang
  • Baosong Yang
  • Derek F. Wong
  • Dayiheng Liu
  • Xiangpeng Wei
  • Jun Xie
  • Lidia S. Chao

Links