Beyond the Rosetta Stone: Unification Forces in Generalization Dynamics
By: Carter Blum, Katja Filippova, Ann Yuan, and more
Potential Business Impact:
Helps computers use knowledge across different languages.
Large language models (LLMs) struggle with cross-lingual knowledge transfer: they hallucinate when asked in one language about facts expressed in a different language during training. This work introduces a controlled setting to study the causes and dynamics of this phenomenon by training small Transformer models from scratch on synthetic multilingual datasets. We identify a learning phase wherein a model develops either separate or unified representations of the same facts across languages, and show that unification is essential for cross-lingual transfer. We also show that the degree of unification depends on the mutual information between facts and training data language, and on how easily that language can be identified from the data. Based on these insights, we develop methods to modulate the level of cross-lingual transfer by manipulating data distribution and tokenization, and we introduce metrics and visualizations to formally characterize their effects on unification. Our work shows how controlled settings can shed light on pre-training dynamics and suggests new directions for improving cross-lingual transfer in LLMs.
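The abstract ties the degree of unification to the mutual information between fact identity and the language a fact appears in during training. As a minimal sketch (not the authors' code), the snippet below estimates that quantity from empirical co-occurrence counts on a toy synthetic corpus; the fact identifiers, language labels, and counts are illustrative assumptions.

```python
# Minimal sketch: estimating I(fact; language) over a synthetic multilingual
# training set. All dataset contents below are hypothetical examples.
from collections import Counter
import math

# Hypothetical synthetic corpus: each training example is (fact_id, language).
examples = [
    ("capital_of_france", "en"), ("capital_of_france", "en"),
    ("capital_of_france", "fr"),
    ("boiling_point_h2o", "fr"), ("boiling_point_h2o", "fr"),
    ("boiling_point_h2o", "en"),
]

def mutual_information(pairs):
    """I(fact; language) in bits, computed from empirical co-occurrence counts."""
    n = len(pairs)
    joint = Counter(pairs)                      # p(fact, language)
    facts = Counter(f for f, _ in pairs)        # p(fact)
    langs = Counter(l for _, l in pairs)        # p(language)
    mi = 0.0
    for (f, l), count in joint.items():
        p_fl = count / n
        p_f = facts[f] / n
        p_l = langs[l] / n
        mi += p_fl * math.log2(p_fl / (p_f * p_l))
    return mi

# Higher I(fact; language) means each fact is seen mostly in one language,
# which the abstract links to weaker unification and poorer transfer.
print(f"I(fact; language) = {mutual_information(examples):.3f} bits")
```

Under this reading, lowering the statistic (e.g. by re-balancing which languages each fact appears in) is one way the data distribution could be manipulated to encourage unified representations, per the abstract's described interventions.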
Similar Papers
A Post-trainer's Guide to Multilingual Training Data: Uncovering Cross-lingual Transfer Dynamics
Computation and Language
Helps computers understand many languages better.
Rethinking Cross-lingual Gaps from a Statistical Viewpoint
Computation and Language
Makes computer translations more accurate.