Revisiting Deep Information Propagation: Fractal Frontier and Finite-size Effects
By: Giuseppe Alessio D'Inverno, Zhiyuan Hu, Leo Davy, and more
Potential Business Impact:
Neural networks exhibit hidden, complex patterns even before training.
Information propagation characterizes how input correlations evolve across the layers of a deep neural network. This framework has been studied extensively with mean-field theory, which assumes infinitely wide networks; these assumptions break down for practical, finite-size networks. In this work, we study information propagation in randomly initialized neural networks of finite width and reveal that the boundary between the ordered and chaotic regimes exhibits a fractal structure. This exposes the fundamental complexity of neural network dynamics in a setting that is independent of input data and optimization. To extend the analysis beyond multilayer perceptrons, we leverage recently introduced Fourier-based structured transforms and show that information propagation in convolutional neural networks exhibits the same behavior. Our investigation highlights the importance of finite network depth for the trade-off between separation and robustness.
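To make the ordered/chaotic picture concrete, here is a minimal sketch (not the authors' code) of the standard signal-propagation experiment: two correlated inputs are pushed through a randomly initialized finite-width tanh MLP, and their cosine similarity is tracked with depth. The weight scale sigma_w and the other parameters below are illustrative assumptions; in mean-field analyses this scale controls whether correlations converge toward 1 (ordered regime) or decay (chaotic regime).

```python
import numpy as np

def correlation_trajectory(width=256, depth=50, sigma_w=2.0, sigma_b=0.1,
                           init_corr=0.9, seed=0):
    """Propagate two correlated inputs through a random tanh MLP and
    record their cosine similarity at every layer."""
    rng = np.random.default_rng(seed)
    # Two inputs with a prescribed initial correlation.
    x = rng.standard_normal(width)
    y = init_corr * x + np.sqrt(1.0 - init_corr**2) * rng.standard_normal(width)
    corrs = []
    for _ in range(depth):
        # Fresh random weights and biases at each layer (random initialization).
        W = rng.standard_normal((width, width)) * sigma_w / np.sqrt(width)
        b = rng.standard_normal(width) * sigma_b
        x = np.tanh(W @ x + b)
        y = np.tanh(W @ y + b)
        corrs.append(np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y)))
    return corrs

# Small sigma_w tends toward the ordered regime, large sigma_w toward chaos.
for sw in (1.0, 2.5):
    traj = correlation_trajectory(sigma_w=sw)
    print(f"sigma_w={sw}: final correlation = {traj[-1]:.3f}")
```

At finite width, the boundary between these two behaviors in parameter space is exactly where the paper reports fractal structure; the sketch only illustrates the quantity being tracked, not the paper's fractal analysis.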
Similar Papers
Functional Percolation: A Perspective on Criticality of Form and Function
Physics and Society
Lets networks process more information when connected.
Information flow in multilayer perceptrons: an in-depth analysis
Information Theory
Helps computers learn better by tracking information flow.