Transient learning dynamics drive escape from sharp valleys in Stochastic Gradient Descent

Published: January 16, 2026 | arXiv ID: 2601.10962v1

By: Ning Yang, Yikuan Zhang, Qi Ouyang, and more

Potential Business Impact:

Could help AI models generalize better by steering training toward flatter, more robust solutions.

Business Areas:
Quantum Computing Science and Engineering

Stochastic gradient descent (SGD) is central to deep learning, yet the dynamical origin of its preference for flatter, more generalizable solutions remains unclear. Here, by analyzing SGD learning dynamics, we identify a nonequilibrium mechanism governing solution selection. Numerical experiments reveal a transient exploratory phase in which SGD trajectories repeatedly escape sharp valleys and transition toward flatter regions of the loss landscape. Using a tractable physical model, we show that SGD noise reshapes the landscape into an effective potential that favors flat solutions. Crucially, we uncover a transient freezing mechanism: as training proceeds, growing energy barriers suppress inter-valley transitions and ultimately trap the dynamics within a single basin. Increasing the SGD noise strength delays this freezing, which enhances convergence to flatter minima. Together, these results provide a unified physical framework linking learning dynamics, loss-landscape geometry, and generalization, and suggest principles for the design of more effective optimization algorithms.
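
The escape mechanism described in the abstract can be illustrated with a toy numerical experiment. The sketch below is not the authors' code; the landscape, constants, and update rule are illustrative assumptions. It runs noisy gradient descent on a one-dimensional loss with a sharp well at x = -1 and an equally deep flat well at x = +1. For constant noise of strength sigma, the stationary density p(x) ~ exp(-V(x)/T) with T = sigma^2/2 assigns more probability mass to the wider flat basin, so with sufficient noise walkers initialized in the sharp valley migrate to the flat one, while weak noise leaves them trapped, mirroring the freezing effect.

# Toy 1D illustration (assumed setup, not the paper's model): an ensemble of
# walkers performs noisy gradient descent on the smooth double-well potential
# V(x) = -log(exp(-k1*(x+1)^2) + exp(-k2*(x-1)^2)), with a sharp well at
# x = -1 (k1 = 50) and a flat well at x = +1 (k2 = 2) of equal depth.
import numpy as np

rng = np.random.default_rng(0)
K_SHARP, K_FLAT = 50.0, 2.0

def grad(x):
    # Gradient of the double-well potential above (softmax-weighted pull
    # toward whichever well dominates locally).
    w1 = np.exp(-K_SHARP * (x + 1.0) ** 2)
    w2 = np.exp(-K_FLAT * (x - 1.0) ** 2)
    return (2 * K_SHARP * (x + 1.0) * w1 + 2 * K_FLAT * (x - 1.0) * w2) / (w1 + w2)

def run(sigma, n_walkers=1000, n_steps=10000, lr=1e-3):
    # Start every walker inside the sharp valley.
    x = -1.0 + 0.05 * rng.standard_normal(n_walkers)
    for _ in range(n_steps):
        # Euler-Maruyama step: gradient descent plus isotropic noise.
        x = x - lr * grad(x) + sigma * np.sqrt(lr) * rng.standard_normal(n_walkers)
        x = np.clip(x, -10.0, 10.0)  # guard against rare large excursions
    return float(np.mean(x > 0.0))  # fraction that ended in the flat basin

for sigma in (0.1, 0.5, 1.0, 2.0):
    print(f"noise sigma = {sigma:.1f}: fraction in flat basin = {run(sigma):.2f}")

Stronger noise lets more walkers cross the barrier within the training budget, so the flat-basin fraction grows with sigma, while weak noise leaves the ensemble frozen in the sharp valley. The paper's analysis develops this intuition rigorously via an effective potential and a transient freezing mechanism, rather than the simplified additive-noise picture used here.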

Page Count
15 pages

Category
Computer Science:
Machine Learning (CS)