Lightweight Cross-Lingual Federated Prompt Tuning for Low-Resource Languages
Proceedings of the Fifteenth Language Resources and Evaluation Conference (LREC 2026)
Abstract
Multilingual NLP faces challenges of data heterogeneity, privacy, and limited computational resources, especially for low-resource languages. Centralised methods risk privacy breaches, while federated learning struggles with communication overhead and poor cross-lingual generalisation. We propose FLiP (Federated Lightweight Prompt-tuning), a privacy-preserving, resource-efficient, and generalisable framework that integrates prompt-based learning with federated optimisation. FLiP eliminates communication overhead, reduces trainable parameters to 16%, and cuts GPU memory use by 90%. Experiments show superior generalisation and efficiency under both IID and non-IID settings, establishing FLiP as a scalable, privacy-aware solution for multilingual NLP, particularly in low-resource and indigenous language contexts.
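To make the core idea concrete, the sketch below is a minimal NumPy simulation of federated prompt tuning (an illustrative assumption, not the authors' released implementation): each client optimises only a small soft-prompt matrix while the backbone model would stay frozen, and the server aggregates the prompts with FedAvg. The prompt shape, client sizes, and toy quadratic objective are all hypothetical placeholders for the real language-model training signal.

```python
# Minimal sketch of federated prompt tuning (assumption, not the FLiP code):
# only a small soft-prompt matrix is trained on each client and exchanged,
# keeping communication and trainable parameters small.
import numpy as np

PROMPT_LEN, HIDDEN_DIM = 8, 64          # soft-prompt shape (hypothetical sizes)
rng = np.random.default_rng(0)

def local_prompt_update(prompt, client_data, lr=0.1, steps=5):
    """One client round: gradient steps on the prompt only (backbone frozen)."""
    target = client_data.mean(axis=0)   # stand-in for the real task signal
    for _ in range(steps):
        grad = prompt - target          # gradient of 0.5 * ||prompt - target||^2
        prompt = prompt - lr * grad
    return prompt

def fedavg(prompts, weights):
    """Server step: data-size-weighted average of client prompt matrices."""
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()
    return sum(w * p for w, p in zip(weights, prompts))

# Simulate a few clients, e.g. one per language community, with unequal data sizes.
clients = [rng.normal(size=(n, PROMPT_LEN, HIDDEN_DIM)) for n in (30, 10, 5)]
global_prompt = np.zeros((PROMPT_LEN, HIDDEN_DIM))

for round_id in range(3):               # federated communication rounds
    local_prompts = [local_prompt_update(global_prompt.copy(), data)
                     for data in clients]
    global_prompt = fedavg(local_prompts, [len(d) for d in clients])
    print(f"round {round_id}: prompt norm = {np.linalg.norm(global_prompt):.3f}")
```

Because only the PROMPT_LEN × HIDDEN_DIM prompt matrix is communicated per round rather than full model weights, this setup illustrates how prompt tuning keeps both the trainable-parameter count and per-round communication small.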