
NRD: A Hybrid Disentanglement Framework for Mitigating Interference in Multilingual Machine Translation

Proceedings of the Fifteenth Language Resources and Evaluation Conference (LREC 2026)

DOI:10.63317/55wnhwvmezwx

Abstract

Negative interference from cross-lingual conflicting syntactic patterns is a primary obstacle in Multilingual Neural Machine Translation (MNMT). We trace this problem to the entanglement of transferable, universal semantics with non-transferable, language-specific syntactic structures. Existing methods, relying on disjoint training-only specialization or inference-only filtering, fail to fully resolve this fundamental entanglement. To address this, we propose NRD (Neuron Representation Disentanglement), a two-stage hybrid framework that couples training-time specialization with inference-time filtering. First, a Specialization Fine-tuning stage identifies functional neurons via a semantic-invariant activation-variance metric and reinforces intrinsic modularity through sparse updates. Second, a Dynamic Representation Filtering stage purifies semantic representations at inference by adaptively suppressing syntax-sensitive neurons, guided by each language’s pre-computed gradient consistency. On the OPUS-100 benchmark, NRD outperforms strong baselines, achieving an average gain of +1.9 BLEU on supervised directions. On the WMT-10 zero-shot benchmark, it obtains a substantial +7.1 BLEU, demonstrating robust cross-lingual generalization. These results provide strong evidence that our hybrid approach effectively purifies semantic representations by mitigating syntactic interference, paving the way for more robust cross-lingual generalization.
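The abstract's core mechanism, identifying "functional" neurons whose activations stay stable across semantically equivalent inputs and suppressing syntax-sensitive neurons at inference, can be illustrated with a minimal sketch. This is an assumption-laden toy version, not the paper's implementation: the function names, the variance ranking, the `ratio` split, and the `alpha` damping factor are all hypothetical stand-ins for the paper's semantic-invariant activation-variance metric and gradient-consistency-guided filtering.

```python
import numpy as np

def split_neurons_by_activation_variance(activations, ratio=0.5):
    """Rank neurons by activation variance across semantically
    equivalent inputs (e.g. parallel sentences in different languages).

    activations: array of shape (num_inputs, num_neurons).
    Low-variance neurons are treated here as semantic (language-
    invariant); high-variance neurons as syntax-sensitive
    (language-specific). Returns boolean masks (semantic, syntactic).
    """
    var = activations.var(axis=0)
    order = np.argsort(var)            # ascending variance
    k = int(len(var) * ratio)          # hypothetical fixed split
    semantic = np.zeros(len(var), dtype=bool)
    semantic[order[:k]] = True
    return semantic, ~semantic

def filter_representation(hidden, syntactic_mask, alpha=0.0):
    """Inference-time filtering: damp syntax-sensitive neurons.
    alpha=0 fully suppresses them; alpha=1 leaves them untouched."""
    scale = np.where(syntactic_mask, alpha, 1.0)
    return hidden * scale

# Toy example: neuron 0 is invariant across inputs, neuron 1 is not.
acts = np.array([[1.0, 0.0],
                 [1.0, 5.0],
                 [1.0, -5.0]])
semantic, syntactic = split_neurons_by_activation_variance(acts)
purified = filter_representation(np.array([2.0, 3.0]), syntactic)
```

In the paper's actual framework, the training stage would additionally apply sparse updates restricted to the identified neurons, and the filtering strength would be set per language from pre-computed gradient consistency rather than a global `alpha`.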

Details

Paper ID
lrec2026-main-677
Pages
pp. 8577-8586
BibKey
zhang-etal-2026-nrd
Editor
N/A
Publisher
European Language Resources Association (ELRA)
ISSN
2522-2686
ISBN
978-2-493814-49-4
Conference
The Fifteenth Language Resources and Evaluation Conference (LREC 2026)
Location
Palma, Mallorca, Spain
Date
11–16 May 2026

Authors

  • Jiarui Zhang

  • Yifan Deng
