
Procrustes Analysis for Improving Language Model Merging

Proceedings of the Fifteenth Language Resources and Evaluation Conference (LREC 2026)

DOI:10.63317/3ywqnrqzviyt

Abstract

The availability of many fine-tuned neural language models for different tasks naturally leads to the question of whether it is worthwhile to combine them, particularly through parameter merging, which is the least resource-intensive option. Among the many existing methods, some focus on parameter alignment before actual merging. In this article, we propose a new method within this research area, based on Procrustes analysis. We evaluate this method for merging fine-tuned models for the same task, derived from the same encoder-based model. Considering nine tasks from the GLUE benchmark, three Named Entity Recognition tasks, and six reference merging methods, we show that our proposal can improve upon existing merging methods in most tested configurations.
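As background for the abstract, the core idea of Procrustes-based alignment can be illustrated with the classical orthogonal Procrustes problem: given two weight matrices, find the orthogonal map that best aligns one to the other before averaging. The sketch below is a generic illustration of this idea using NumPy (Schönemann's SVD solution), not the specific procedure proposed in the paper; the function names and the simple 0.5/0.5 averaging are illustrative assumptions.

```python
import numpy as np

def orthogonal_procrustes(a, b):
    # Orthogonal matrix R minimizing ||a @ R - b||_F
    # (Schönemann's closed-form solution via SVD).
    u, _, vt = np.linalg.svd(a.T @ b)
    return u @ vt

def merge_aligned(w_ref, w_other):
    # Illustrative merge: align w_other to w_ref, then average.
    # The paper's actual merging procedure may differ.
    r = orthogonal_procrustes(w_other, w_ref)
    return 0.5 * (w_ref + w_other @ r)
```

For instance, if one model's weights are an orthogonally rotated copy of the other's, alignment undoes the rotation and the merged matrix coincides with the reference, whereas naive averaging without alignment would not.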

Details

Paper ID
lrec2026-main-783
Pages
pp. 9988-9998
BibKey
ferret-2026-procrustes
Editor
N/A
Publisher
European Language Resources Association (ELRA)
ISSN
2522-2686
ISBN
978-2-493814-49-4
Conference
The Fifteenth Language Resources and Evaluation Conference (LREC 2026)
Location
Palma, Mallorca, Spain
Date
11–16 May 2026

Authors

  • Olivier Ferret
