The Role of Mixed-Language Documents for Multilingual Large Language Model Pretraining
By: Jiandong Shao, Raphael Tang, Crystina Zhang, and more
Potential Business Impact:
Shows which kind of bilingual pretraining data makes language models translate better.
Multilingual large language models achieve impressive cross-lingual performance despite largely monolingual pretraining. While bilingual data in pretraining corpora is widely believed to enable these abilities, the details of its contribution remain unclear. We investigate this question by pretraining models from scratch under controlled conditions, comparing the standard web corpus with a monolingual-only version that removes all multilingual documents. Although bilingual data constitutes only 2% of the corpus, removing it causes translation performance to drop by 56% in BLEU, while performance on cross-lingual QA and general reasoning tasks remains stable, with training curves largely overlapping the baseline. To understand this asymmetry, we categorize bilingual data into parallel (14%), code-switching (72%), and miscellaneous documents (14%) based on the semantic relevance of content in different languages. We then conduct granular ablations by reintroducing parallel or code-switching data into the monolingual-only corpus. Our experiments reveal that parallel data almost fully restores translation performance (91% of the unfiltered baseline), whereas code-switching contributes minimally. Other cross-lingual tasks remain largely unaffected by either type. These findings indicate that translation critically depends on the systematic token-level alignments provided by parallel data, whereas cross-lingual understanding and reasoning appear to be achievable even without bilingual data.
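The abstract does not spell out how documents are split into monolingual vs. bilingual, or how bilingual documents are labeled parallel, code-switching, or miscellaneous. The sketch below is one plausible way to implement such a categorization, not the authors' pipeline: it assumes a fastText language-ID model (`lid.176.bin`), the LaBSE sentence encoder from `sentence-transformers`, and made-up thresholds; the function names and the similarity heuristic are illustrative assumptions.

```python
# Illustrative sketch (assumed, not the paper's method): classify a document as
# monolingual, parallel, code-switching, or miscellaneous, using per-sentence
# language ID plus cross-lingual sentence similarity as a proxy for the
# "semantic relevance of content in different languages" the abstract mentions.

import fasttext
import numpy as np
from sentence_transformers import SentenceTransformer

lid_model = fasttext.load_model("lid.176.bin")                 # language identifier (assumed path)
encoder = SentenceTransformer("sentence-transformers/LaBSE")   # multilingual sentence embeddings


def sentence_languages(sentences):
    """Predict a language code for each sentence (fastText labels look like '__label__en')."""
    labels, _ = lid_model.predict([s.replace("\n", " ") for s in sentences])
    return [lab[0].replace("__label__", "") for lab in labels]


def categorize_document(sentences, sim_threshold=0.7):
    """Return 'monolingual', 'parallel', 'code-switching', or 'miscellaneous'."""
    langs = sentence_languages(sentences)
    unique = set(langs)
    if len(unique) < 2:
        return "monolingual"

    # Group sentences by language and compare the two largest language groups.
    by_lang = {lang: [s for s, l in zip(sentences, langs) if l == lang] for lang in unique}
    lang_a, lang_b = sorted(by_lang, key=lambda l: len(by_lang[l]), reverse=True)[:2]

    emb_a = encoder.encode(by_lang[lang_a], normalize_embeddings=True)
    emb_b = encoder.encode(by_lang[lang_b], normalize_embeddings=True)

    # Mean best-match cosine similarity across the two language groups:
    # high similarity suggests translated (parallel) content, low similarity
    # with substantial text in both languages suggests code-switching.
    sim = float(np.mean((emb_a @ emb_b.T).max(axis=1)))
    if sim >= sim_threshold:
        return "parallel"
    if min(len(by_lang[lang_a]), len(by_lang[lang_b])) >= 3:
        return "code-switching"
    return "miscellaneous"
```

Under this kind of heuristic, the monolingual-only ablation corpus would keep only documents labeled "monolingual", and the granular ablations would add back the "parallel" or "code-switching" subsets; the actual thresholds and models used in the paper may differ.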
Similar Papers
Revisiting Multilingual Data Mixtures in Language Model Pretraining
Computation and Language
Makes computers understand many languages better.
Massively Multilingual Adaptation of Large Language Models Using Bilingual Translation Data
Computation and Language
Helps computers understand many more languages.
Just Go Parallel: Improving the Multilingual Capabilities of Large Language Models
Computation and Language
Adds more languages to computer translators.