Slow Transition to Low-Dimensional Chaos in Heavy-Tailed Recurrent Neural Networks
By: Yi Xie, Stefan Mihalas, Łukasz Kuśmierz
Potential Business Impact:
Brain-like networks stay near the edge of chaos more easily, but their activity is simpler.
Growing evidence suggests that synaptic weights in the brain follow heavy-tailed distributions, yet most theoretical analyses of recurrent neural networks (RNNs) assume Gaussian connectivity. We systematically study the activity of RNNs with random weights drawn from biologically plausible Lévy alpha-stable distributions. While mean-field theory for the infinite system predicts that the quiescent state is always unstable -- implying ubiquitous chaos -- our finite-size analysis reveals a sharp transition between quiescent and chaotic dynamics. We theoretically predict the gain at which the system transitions from quiescent to chaotic dynamics, and validate it through simulations. Compared to Gaussian networks, heavy-tailed RNNs exhibit a broader parameter regime near the edge of chaos, that is, a slower transition to chaos. However, this robustness comes with a tradeoff: heavier tails reduce the Lyapunov dimension of the attractor, indicating lower effective dimensionality. Our results reveal a biologically aligned tradeoff between the robustness of dynamics near the edge of chaos and the richness of high-dimensional neural activity. By analytically characterizing the transition point in finite-size networks -- where mean-field theory breaks down -- we provide a tractable framework for understanding dynamics in realistically sized, heavy-tailed neural circuits.
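The abstract does not specify the model equations, so the following is only a minimal sketch of the kind of experiment it describes: a rate RNN dx/dt = -x + W tanh(x) whose weights are i.i.d. symmetric Lévy alpha-stable samples (drawn with scipy.stats.levy_stable), scaled by an assumed gain g / N^(1/alpha). The largest Lyapunov exponent is estimated by tracking the divergence of a perturbed trajectory; the gain values, network size, and scaling convention are illustrative assumptions, not parameters taken from the paper.

```python
import numpy as np
from scipy.stats import levy_stable

def estimate_lyapunov(N=500, alpha=1.5, g=1.0, T=200.0, dt=0.1,
                      eps=1e-8, seed=0):
    """Estimate the largest Lyapunov exponent of a rate RNN
    dx/dt = -x + W tanh(x) with i.i.d. symmetric alpha-stable weights.
    The g / N**(1/alpha) weight scaling is an assumption for illustration."""
    rng = np.random.default_rng(seed)
    # Symmetric (beta = 0) alpha-stable entries, scaled by the gain.
    W = g * levy_stable.rvs(alpha, 0.0, size=(N, N),
                            random_state=rng) / N**(1.0 / alpha)
    x = 0.1 * rng.standard_normal(N)          # reference trajectory
    y = x + eps * rng.standard_normal(N)      # perturbed copy
    log_growth = 0.0
    steps = int(T / dt)
    for _ in range(steps):
        # Euler step for both trajectories.
        x = x + dt * (-x + W @ np.tanh(x))
        y = y + dt * (-y + W @ np.tanh(y))
        d = np.linalg.norm(y - x)
        log_growth += np.log(d / eps)
        # Renormalize the perturbation so it stays infinitesimal.
        y = x + (y - x) * (eps / d)
    return log_growth / (steps * dt)

if __name__ == "__main__":
    # Illustrative gains: negative exponent -> quiescent, positive -> chaotic.
    for g in (0.5, 1.5, 3.0):
        lam = estimate_lyapunov(g=g)
        state = "chaotic" if lam > 0 else "quiescent"
        print(f"g = {g:.1f}: lambda_max ~ {lam:+.3f} ({state})")
```

Sweeping g (and alpha) in this way is one rough, finite-size check of where the quiescent-to-chaotic transition sits, in the spirit of the simulations the abstract mentions.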
Similar Papers
Generative System Dynamics in Recurrent Neural Networks
Machine Learning (CS)
Makes computer memory remember longer and better.
An Analytical Characterization of Sloppiness in Neural Networks: Insights from Linear Models
Machine Learning (CS)
Finds simple patterns in how computer brains learn.
Propagation of Chaos in One-hidden-layer Neural Networks beyond Logarithmic Time
Machine Learning (Stat)
Makes AI learn faster with fewer parts.