Multi-step Predictive Coding Leads To Simplicity Bias

Published: November 12, 2025 | arXiv ID: 2511.09290v1

By: Aviv Ratzon, Omri Barak

Potential Business Impact:

Sufficiently deep networks trained to predict several steps ahead recover the environment's hidden (latent) structure.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

Predictive coding is a framework for understanding the formation of low-dimensional internal representations that mirror the environment's latent structure. The conditions under which such representations emerge remain unclear. In this work, we investigate how the prediction horizon and network depth shape the solutions of predictive coding tasks. Using a minimal abstract setting inspired by prior work, we show empirically and theoretically that sufficiently deep networks trained with multi-step prediction horizons consistently recover the underlying latent structure, a phenomenon explained through the structure of the Ordinary Least Squares estimator and biases in learning dynamics. We then extend these insights to nonlinear networks and more complex datasets, including piecewise linear functions, MNIST, multiple latent states, and higher-dimensional state geometries. Our results provide a principled understanding of when and why predictive coding induces structured representations, bridging the gap between empirical observations and theoretical foundations.
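To make the setup concrete, below is a minimal sketch (not the authors' code) of the kind of experiment the abstract describes: a deep linear network is trained to predict observations several steps ahead of a low-dimensional latent dynamical system, and a linear probe then checks whether a hidden layer recovers the latent state. The dimensions, the rotation dynamics, the hidden width, and the choice of probing the halfway layer are all illustrative assumptions.

```python
# Illustrative sketch only, not the paper's code: multi-step prediction on
# observations of a low-dimensional latent dynamical system, followed by a
# linear probe of a hidden layer. All hyperparameters are assumptions.
import math
import torch
import torch.nn as nn

torch.manual_seed(0)

latent_dim, obs_dim, depth, horizon = 2, 20, 4, 5   # assumed sizes
T = 2000

# Latent dynamics: a slow 2D rotation, observed through a random projection.
theta = 0.1
A = torch.tensor([[math.cos(theta), -math.sin(theta)],
                  [math.sin(theta),  math.cos(theta)]])
C = torch.randn(obs_dim, latent_dim)

z = [torch.randn(latent_dim)]
for _ in range(T + horizon):
    z.append(A @ z[-1])
z = torch.stack(z)                     # latent trajectory, (T + horizon + 1, latent_dim)
x = z @ C.T                            # observations, (T + horizon + 1, obs_dim)

inputs, targets = x[:T], x[horizon:T + horizon]   # predict x_{t+horizon} from x_t

# Deep linear network; keeping the hidden width equal to obs_dim is arbitrary.
net = nn.Sequential(*[nn.Linear(obs_dim, obs_dim, bias=False) for _ in range(depth)])

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(3000):
    opt.zero_grad()
    loss = ((net(inputs) - targets) ** 2).mean()
    loss.backward()
    opt.step()

# Linear probe: does the activity halfway through the network encode z_t?
with torch.no_grad():
    h = net[: depth // 2](inputs)                     # hidden activity
    beta = torch.linalg.lstsq(h, z[:T]).solution      # least-squares readout h -> z
    resid = z[:T] - h @ beta
    r2 = 1.0 - resid.var() / z[:T].var()              # high R^2 suggests latent recovery
print(f"{horizon}-step loss: {loss.item():.4f}, latent readout R^2: {r2.item():.3f}")
```

Varying `horizon` and `depth` in a sketch like this is one way to probe the abstract's claim that deeper networks and longer prediction horizons favor solutions aligned with the latent structure.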

Country of Origin
🇮🇱 Israel

Page Count
16 pages

Category
Computer Science:
Machine Learning (CS)