Masked Diffusion Language Models with Frequency-Informed Training

Published: September 5, 2025 | arXiv ID: 2509.05056v1

By: Despoina Kosmopoulou, Efthymios Georgiou, Vaggelis Dorovatas, and more

Potential Business Impact:

Teaches computers language using much less training text than standard methods require.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

We present a masked diffusion language modeling framework for data-efficient training, developed for the BabyLM 2025 Challenge. Our approach applies diffusion training objectives to language modeling under strict data constraints, incorporating frequency-informed masking that prioritizes learning from rare tokens while maintaining theoretical validity. We explore multiple noise scheduling strategies, including two-mode approaches, and investigate different noise weighting schemes within the NELBO objective. We evaluate our method on the BabyLM benchmark suite, measuring linguistic competence, world knowledge, and human-likeness. Results show performance competitive with hybrid autoregressive-masked baselines, demonstrating that diffusion-based training offers a viable alternative for data-restricted language learning.
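To make the frequency-informed masking idea concrete, below is a minimal PyTorch sketch. It is an illustrative reconstruction, not the authors' implementation: the function frequency_informed_mask, the inverse-frequency weighting, and the renormalization against the schedule's mask rate are all assumptions chosen for illustration.

    import torch

    def frequency_informed_mask(tokens, token_freqs, mask_rate, mask_id):
        """Mask tokens with probability biased toward rare tokens.

        Hypothetical scheme: per-token mask probabilities are proportional
        to inverse corpus frequency, rescaled so that the expected fraction
        of masked tokens matches `mask_rate` (the corruption level given by
        the noise schedule at the sampled diffusion timestep).
        """
        # Inverse-frequency weights; +1 avoids division by zero.
        weights = 1.0 / (token_freqs[tokens] + 1.0)
        # Rescale so the mean masking probability equals mask_rate.
        probs = (weights * (mask_rate / weights.mean())).clamp(max=1.0)
        mask = torch.rand_like(probs) < probs
        noisy = tokens.clone()
        noisy[mask] = mask_id
        return noisy, mask

    # Toy usage: vocabulary of 100 tokens, batch of 4 sequences of length 16.
    vocab_size, mask_id = 100, 99
    token_freqs = torch.randint(1, 1000, (vocab_size,)).float()  # corpus counts
    tokens = torch.randint(0, vocab_size - 1, (4, 16))
    t = torch.rand(())          # sampled timestep in (0, 1)
    mask_rate = t               # e.g. a linear schedule where alpha(t) = 1 - t
    noisy, mask = frequency_informed_mask(tokens, token_freqs, mask_rate, mask_id)

Rescaling by the mean keeps the expected masked fraction at the schedule's mask rate, which is one way a rare-token bias can stay consistent with a fixed noise schedule; the clamp makes this only approximate when very rare tokens saturate at probability 1.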

Country of Origin
🇨🇭 Switzerland

Page Count
9 pages

Category
Computer Science:
Computation and Language