In2x at WMT25 Translation Task
By: Lei Pang, Hanyi Mao, Quanjia Xiao, and others
Potential Business Impact:
Helps large language models translate low-resource and less commonly spoken languages more accurately.
This paper presents the open-system submission by the In2x research team for the WMT25 General Machine Translation Shared Task. Our submission focuses on Japanese-related translation tasks, aiming to explore a generalizable paradigm for extending large language models (LLMs) to other languages. This paradigm encompasses aspects such as data construction methods and reward model design. The ultimate goal is to enable large language model systems to achieve exceptional performance in low-resource or less commonly spoken languages.
Similar Papers
Preliminary Ranking of WMT25 General Machine Translation Systems
Computation and Language
Provides a preliminary ranking of the machine translation systems submitted to WMT25.
Beyond English: Toward Inclusive and Scalable Multilingual Machine Translation with LLMs
Computation and Language
Improves multilingual translation across 60 languages, including Chinese.
LLaMAX2: Your Translation-Enhanced Model also Performs Well in Reasoning
Computation and Language
Shows that a translation-enhanced model also performs well on reasoning tasks.