Score: 1

Mask and You Shall Receive: Optimizing Masked Language Modeling For Pretraining BabyLMs

Published: October 23, 2025 | arXiv ID: 2510.20475v1

By: Lukas Edman, Alexander Fraser

Potential Business Impact:

Improves how language models are pretrained on small amounts of text, helping them understand language better with less data.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

We describe our strategy for the 2025 edition of the BabyLM Challenge. Our main contribution is an improved form of Masked Language Modeling (MLM), which adapts the probabilities with which tokens are masked according to the model's ability to predict them. The results show a substantial increase in performance on (Super)GLUE tasks over standard MLM. We also incorporate sub-token embeddings, finding that they increase the model's morphological generalization capabilities. Our submission beats the baseline in the strict-small track.
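
The abstract leaves open exactly how the masking probabilities are adapted. As a minimal sketch of one possible interpretation (not the authors' actual method), the example below tracks a running per-token-type loss estimate and masks harder-to-predict tokens more often, while keeping the expected overall mask rate fixed. The class name `AdaptiveMasker` and its parameters are hypothetical, introduced here purely for illustration.

```python
# Hypothetical sketch of loss-adaptive masking for MLM pretraining.
# Assumes PyTorch; all names and hyperparameters are illustrative.
import torch


class AdaptiveMasker:
    def __init__(self, vocab_size, mask_rate=0.15, momentum=0.99):
        self.mask_rate = mask_rate                  # average fraction of tokens to mask
        self.momentum = momentum                    # smoothing for the difficulty estimate
        self.difficulty = torch.ones(vocab_size)    # running per-token-type loss estimate

    def update(self, masked_token_ids, losses):
        # Update the running difficulty estimate for each token that was masked,
        # using an exponential moving average of its prediction loss.
        for tid, loss in zip(masked_token_ids.tolist(), losses.tolist()):
            self.difficulty[tid] = (
                self.momentum * self.difficulty[tid] + (1.0 - self.momentum) * loss
            )

    def mask(self, input_ids):
        # Tokens the model finds harder to predict get a higher masking probability,
        # rescaled so the expected mask rate stays at `mask_rate`.
        weights = self.difficulty[input_ids]
        probs = (weights / weights.mean() * self.mask_rate).clamp(max=1.0)
        return torch.bernoulli(probs).bool()
```

In a training loop under these assumptions, `mask(batch)` would select positions to replace with the mask token, and after the forward pass the per-position losses at masked positions would be fed back through `update(...)`, so that the masking distribution gradually concentrates on tokens the model still predicts poorly.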

Country of Origin
🇩🇪 Germany

Repos / Data Links

Page Count
9 pages

Category
Computer Science:
Computation and Language