Curriculum Learning for LLM Pretraining: An Analysis of Learning Dynamics

Published: January 29, 2026 | arXiv ID: 2601.21698v1

By: Mohamed Elgaar, Hadi Amiri

Potential Business Impact:

Could improve language-model training stability and accuracy, especially at smaller scales, by reordering pretraining data from easy to hard.

Business Areas:
E-Learning Education, Software

Curriculum learning changes the order of pre-training data, but it remains unclear whether it changes the learning trajectory or mainly reorders exposure over a fixed trajectory. We train Pythia models (14M–410M parameters) for 300B tokens under three linguistically motivated curricula: Age-of-Acquisition, word frequency, and Verb Variation (VV). We compare each against Random ordering; at 1B parameters we compare Random and VV. Across orderings, training follows a shared sequence of latent phases, while curricula mainly change within-phase data exposure. In smaller models (up to 160M parameters), Random ordering exhibits higher gradient noise and stronger late-training output-head spectral saturation, alongside lower final accuracy; curricula reduce both effects at matched compute. At larger scales, saturation differences are smaller and curriculum gains shrink. We formalize the link between difficulty pacing and optimization stability in an idealized analysis based on gradient-variance control. Our results point to a practical takeaway: curricula help by stabilizing within-phase optimization rather than by creating new phases.
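The paper's gradient-variance argument can be illustrated with the standard descent lemma for SGD; this is a generic sketch of that style of analysis, not the authors' exact derivation. For an $L$-smooth loss $f$ and update $\theta_{t+1} = \theta_t - \eta g_t$, where $g_t$ is an unbiased stochastic gradient with $\mathbb{E}[g_t] = \nabla f(\theta_t)$ and variance $\sigma_t^2$:

    \mathbb{E}[f(\theta_{t+1})] \le f(\theta_t) - \eta \,\|\nabla f(\theta_t)\|^2 + \frac{L\eta^2}{2}\left(\|\nabla f(\theta_t)\|^2 + \sigma_t^2\right)

The $\sigma_t^2$ term is where pacing enters: if a curriculum samples more homogeneous (easier) data within a phase, per-step gradient variance shrinks and the expected descent per step improves at the same compute, consistent with the lower gradient noise the paper reports for curricula in smaller models.

To make the pacing mechanism concrete, below is a minimal Python sketch of a frequency-based curriculum ordering. Everything here is an illustrative assumption rather than the authors' implementation: the difficulty proxy, the function names, and the bucketed shuffling are hypothetical, and the Age-of-Acquisition and Verb Variation curricula would substitute different difficulty scores.

    import math
    import random

    def difficulty(example, word_freq):
        """Illustrative difficulty proxy (an assumption, not the paper's code):
        mean negative log relative frequency, so rarer words => harder example."""
        if not example:
            return 0.0
        return sum(-math.log(word_freq.get(tok, 1e-9)) for tok in example) / len(example)

    def curriculum_order(examples, word_freq, n_buckets=10, seed=0):
        """Order examples easy-to-hard, then shuffle within coarse difficulty
        buckets: difficulty is paced across training while exposure within
        each bucket stays stochastic."""
        rng = random.Random(seed)
        ranked = sorted(examples, key=lambda ex: difficulty(ex, word_freq))
        bucket_size = max(1, len(ranked) // n_buckets)
        ordered = []
        for start in range(0, len(ranked), bucket_size):
            bucket = ranked[start:start + bucket_size]
            rng.shuffle(bucket)
            ordered.extend(bucket)
        return ordered

    # Toy usage: tokenized sentences with made-up relative frequencies.
    freq = {"the": 0.07, "cat": 0.001, "sat": 0.0008, "ubiquitous": 0.00001}
    data = [["the", "cat", "sat"], ["the", "ubiquitous", "cat"], ["the", "the"]]
    for ex in curriculum_order(data, freq):
        print(ex)

Shuffling within buckets rather than enforcing a strict global sort loosely mirrors the paper's finding that curricula act on within-phase data exposure rather than creating new training phases.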

Country of Origin
🇺🇸 United States

Page Count
19 pages

Category
Computer Science:
Machine Learning (CS)