Revisiting Deep Information Propagation: Fractal Frontier and Finite-size Effects

Published: August 5, 2025 | arXiv ID: 2508.03222v1

By: Giuseppe Alessio D'Inverno, Zhiyuan Hu, Leo Davy, and more

Potential Business Impact:

Randomly initialized neural networks exhibit a fractal boundary between ordered and chaotic behavior, so small changes in initialization can alter how reliably information propagates through deep models of practical, finite size.

Information propagation characterizes how input correlations evolve across layers in deep neural networks. This framework has been well studied using mean-field theory, which assumes infinitely wide networks. However, these assumptions break down for practical, finite-size networks. In this work, we study information propagation in randomly initialized neural networks with finite width and reveal that the boundary between ordered and chaotic regimes exhibits a fractal structure. This shows the fundamental complexity of neural network dynamics, in a setting that is independent of input data and optimization. To extend this analysis beyond multilayer perceptrons, we leverage recently introduced Fourier-based structured transforms and show that information propagation in convolutional neural networks exhibits the same behavior. Our investigation highlights the importance of finite network depth with respect to the tradeoff between separation and robustness.
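To make the setting concrete, here is a minimal sketch (not the authors' code) of finite-width information propagation: it tracks how the correlation between two inputs evolves layer by layer in a randomly initialized tanh multilayer perceptron. All parameter choices (width, depth, sigma_w, sigma_b, the tanh nonlinearity) are illustrative assumptions, not values from the paper.

```python
# Sketch: finite-width correlation propagation in a random tanh MLP.
# Assumed, illustrative parameters throughout; not the paper's experiments.
import numpy as np

def layer_correlations(width=256, depth=50, sigma_w=1.5, sigma_b=0.05,
                       rho0=0.6, seed=0):
    rng = np.random.default_rng(seed)
    # Two inputs with approximate initial correlation rho0, scaled so that
    # the mean squared entry is about 1.
    x = rng.standard_normal(width)
    y = rho0 * x + np.sqrt(1.0 - rho0**2) * rng.standard_normal(width)
    x *= np.sqrt(width) / np.linalg.norm(x)
    y *= np.sqrt(width) / np.linalg.norm(y)

    corrs = []
    for _ in range(depth):
        # Random Gaussian weights and biases, variance scaled as in
        # standard mean-field initializations: W ~ N(0, sigma_w^2 / width).
        W = rng.standard_normal((width, width)) * sigma_w / np.sqrt(width)
        b = rng.standard_normal(width) * sigma_b
        x = np.tanh(W @ x + b)
        y = np.tanh(W @ y + b)
        # Cosine similarity as a finite-width proxy for the mean-field
        # correlation map c^(l+1) = C(c^l; sigma_w, sigma_b).
        corrs.append(float(x @ y / (np.linalg.norm(x) * np.linalg.norm(y))))
    return corrs

if __name__ == "__main__":
    # In the ordered phase correlations are driven toward 1 (inputs merge);
    # in the chaotic phase they decay (inputs separate). Near the boundary,
    # individual finite-width runs can flip between the two outcomes.
    for sw in (1.0, 2.5):
        c = layer_correlations(sigma_w=sw)
        print(f"sigma_w={sw}: final correlation ~= {c[-1]:.3f}")
```

Sweeping sigma_w and sigma_b on a fine grid and recording which outcome each finite-width run converges to is one way to visualize the ordered/chaotic boundary that the paper reports as fractal; in the infinite-width mean-field limit the same boundary is a smooth curve.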

Country of Origin
🇮🇹 Italy

Page Count
18 pages

Category
Computer Science:
Machine Learning (CS)