24-26 Feb 2025 Paris (France)
LYRA - Language verY Rare for All
Ibrahim Merad 1, Amos Wolf 1, Ziad Mazzawi 1, Yannick Léo 1,*
1 : Emerton Data, 16 avenue Hoche, France
* : Corresponding author

In the quest to overcome language barriers, encoder-decoder models like NLLB have expanded machine translation to rare languages, with some models (e.g., NLLB 1.3B) even trainable on a single GPU. While general-purpose LLMs perform well in translation, open LLMs prove highly competitive when fine-tuned for specific tasks involving unknown corpora. We introduce LYRA (Language verY Rare for All), a novel approach that combines open LLM fine-tuning, retrieval-augmented generation (RAG), and transfer learning from related high-resource languages. To facilitate adoption, all training is restricted to a single GPU. We focus on two-way translation between French and Monégasque, a rare language unsupported by existing translation tools due to limited corpus availability. Our results demonstrate LYRA's effectiveness: it consistently matches, and frequently surpasses, state-of-the-art encoder-decoder models in rare-language translation.
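The sketch below illustrates the RAG idea the abstract refers to: retrieving the most similar parallel sentence pairs from a small French-Monégasque corpus and packing them into a few-shot prompt for a fine-tuned open LLM. It is a minimal illustration, not the authors' implementation; the word-overlap retriever, the prompt format, and the placeholder corpus entries are all assumptions made for demonstration purposes.

```python
# Minimal sketch (not LYRA's actual code): retrieval-augmented prompt construction
# for rare-language translation. The similarity measure, prompt format, and corpus
# placeholders are illustrative assumptions.

def similarity(a: str, b: str) -> float:
    """Crude word-overlap (Jaccard) similarity between two French sentences."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / max(len(sa | sb), 1)

def build_prompt(source_fr: str, corpus: list[tuple[str, str]], k: int = 3) -> str:
    """Select the k most similar (French, Monégasque) pairs and format a few-shot
    prompt for a fine-tuned open LLM to complete with the missing translation."""
    examples = sorted(corpus, key=lambda pair: similarity(source_fr, pair[0]),
                      reverse=True)[:k]
    lines = ["Translate from French to Monégasque."]
    for fr, moneg in examples:
        lines.append(f"French: {fr}\nMonégasque: {moneg}")
    lines.append(f"French: {source_fr}\nMonégasque:")
    return "\n\n".join(lines)

if __name__ == "__main__":
    # Placeholder corpus entries; real Monégasque references would come from the
    # (limited) parallel corpus mentioned in the abstract.
    corpus = [
        ("Bonjour, comment allez-vous ?", "<Monégasque translation 1>"),
        ("Merci beaucoup.", "<Monégasque translation 2>"),
        ("Où est le port ?", "<Monégasque translation 3>"),
    ]
    print(build_prompt("Bonjour, où est le port ?", corpus, k=2))
```

In practice, the resulting prompt would be passed to the fine-tuned open LLM, which generates the Monégasque translation conditioned on the retrieved examples.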



  • Poster
  • Picture