Score: 1

Beyond the Rosetta Stone: Unification Forces in Generalization Dynamics

Published: August 14, 2025 | arXiv ID: 2508.11017v2

By: Carter Blum, Katja Filippova, Ann Yuan, and more

BigTech Affiliations: Google

Potential Business Impact:

Helps language models correctly use facts learned in one language when answering questions in another.

Large language models (LLMs) struggle with cross-lingual knowledge transfer: they hallucinate when asked in one language about facts expressed in a different language during training. This work introduces a controlled setting to study the causes and dynamics of this phenomenon by training small Transformer models from scratch on synthetic multilingual datasets. We identify a learning phase wherein a model develops either separate or unified representations of the same facts across languages, and show that unification is essential for cross-lingual transfer. We also show that the degree of unification depends on mutual information between facts and training data language, and on how easy it is to extract that language. Based on these insights, we develop methods to modulate the level of cross-lingual transfer by manipulating data distribution and tokenization, and we introduce metrics and visualizations to formally characterize their effects on unification. Our work shows how controlled settings can shed light on pre-training dynamics and suggests new directions for improving cross-lingual transfer in LLMs.
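The abstract's central quantitative claim is that the degree of unification depends on the mutual information between facts and training-data language. As a minimal illustrative sketch (not the paper's code; the dataset shape and names below are assumptions), here is how that empirical quantity could be computed over (fact, language) pairs drawn from a synthetic corpus:

```python
# Minimal sketch (not the paper's implementation): estimating the empirical
# mutual information I(F; L) between fact identity F and the language L of a
# training example. The (fact, language) pair format is an illustrative
# assumption about how a synthetic multilingual dataset might be labeled.
from collections import Counter
from math import log2

def mutual_information(pairs):
    """Empirical I(F; L) in bits over a list of (fact, language) pairs."""
    n = len(pairs)
    joint = Counter(pairs)                         # joint counts of (fact, language)
    fact_marginal = Counter(f for f, _ in pairs)   # counts per fact
    lang_marginal = Counter(l for _, l in pairs)   # counts per language
    return sum(
        (c / n) * log2((c / n) / ((fact_marginal[f] / n) * (lang_marginal[l] / n)))
        for (f, l), c in joint.items()
    )

# Toy corpora: if every fact appears in only one language, facts fully
# predict language and I(F; L) is maximal.
separated = [("fact_a", "en"), ("fact_b", "fr"), ("fact_a", "en"), ("fact_b", "fr")]
# If every fact is expressed evenly across languages, I(F; L) is zero.
mixed = [("fact_a", "en"), ("fact_a", "fr"), ("fact_b", "en"), ("fact_b", "fr")]

print(mutual_information(separated))  # 1.0 bit
print(mutual_information(mixed))      # 0.0 bits
```

In this framing, a corpus where each fact is tied to a single language maximizes I(F; L), which the paper associates with separate per-language representations and weaker cross-lingual transfer, while mixing facts across languages drives I(F; L) toward zero and favors unification.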

Country of Origin
🇺🇸 United States

Page Count
22 pages

Category
Computer Science: Computation and Language