Score: 1

Entropic Confinement and Mode Connectivity in Overparameterized Neural Networks

Published: December 6, 2025 | arXiv ID: 2512.06297v1

By: Luca Di Carlo, Chase Goddard, David J. Schwab

BigTech Affiliations: Princeton University

Potential Business Impact:

Explains why AI training settles into one stable solution instead of drifting between equally good ones, which could make model training more predictable and reproducible.

Business Areas:
Intelligent Systems, Artificial Intelligence, Data and Analytics, Science and Engineering

Modern neural networks exhibit a striking property: basins of attraction in the loss landscape are often connected by low-loss paths, yet optimization dynamics generally remain confined to a single convex basin and rarely explore intermediate points. We resolve this paradox by identifying entropic barriers arising from the interplay between curvature variations along these paths and noise in optimization dynamics. Empirically, we find that curvature systematically rises away from minima, producing effective forces that bias noisy dynamics back toward the endpoints, even when the loss remains nearly flat. These barriers persist longer than energetic barriers, shaping the late-time localization of solutions in parameter space. Our results highlight the role of curvature-induced entropic forces in governing both connectivity and confinement in deep learning landscapes.
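The mechanism the abstract describes lends itself to a small numerical illustration. Below is a minimal sketch (not code from the paper; the landscape, parameters, and function names are all illustrative assumptions) of Langevin dynamics on a 2D toy loss whose valley floor is exactly flat, but whose transverse curvature rises away from the path endpoints. Noise in the stiff transverse direction rectifies into an effective drift toward low curvature, confining the walker near the endpoints despite the absence of any energetic barrier.

```python
# Toy illustration (assumed, not the paper's code): entropic confinement
# from curvature variation along a perfectly flat low-loss path.
import numpy as np

rng = np.random.default_rng(0)

def curvature(x):
    # Transverse stiffness k(x): lowest at the path endpoints x = 0 and
    # x = 1, peaking mid-path, mimicking curvature rising away from minima.
    return 1.0 + 8.0 * x * (1.0 - x)

def grad(x, y):
    # Loss L(x, y) = k(x) * y^2 / 2 is exactly zero along y = 0
    # (a flat, connected valley), but its stiffness varies with x.
    dk = 8.0 - 16.0 * x              # k'(x)
    return 0.5 * dk * y**2, curvature(x) * y

T, dt, steps = 0.05, 1e-3, 200_000   # temperature, step size, iterations
x, y = 0.5, 0.0                      # start at the midpoint of the valley
xs = np.empty(steps)
for t in range(steps):
    gx, gy = grad(x, y)
    x += -gx * dt + np.sqrt(2 * T * dt) * rng.normal()
    y += -gy * dt + np.sqrt(2 * T * dt) * rng.normal()
    # Reflecting boundaries keep the walker on the path segment [0, 1].
    if x < 0.0: x = -x
    if x > 1.0: x = 2.0 - x
    xs[t] = x

# Transverse noise rectifies into a drift ~ -(T/2) d log k(x)/dx, so the
# walker accumulates near the low-curvature endpoints even though the
# valley floor is perfectly flat.
print("fraction of time near endpoints (x<0.2 or x>0.8):",
      ((xs < 0.2) | (xs > 0.8)).mean())
```

In equilibrium the marginal density along the path is proportional to k(x)^(-1/2), so the flat direction is not explored uniformly: time concentrates where curvature is lowest, which is the entropic confinement effect the abstract attributes to noisy training dynamics.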

Country of Origin
🇺🇸 United States

Page Count
15 pages

Category
Computer Science:
Machine Learning (CS)