Contraction, Criticality, and Capacity: A Dynamical-Systems Perspective on Echo-State Networks
By: Pradeep Singh, Lavanya Sankaranarayanan, Balasubramanian Raman
Potential Business Impact:
Makes computers remember past information better.
Echo-State Networks (ESNs) distil a key neurobiological insight: richly recurrent but fixed circuitry combined with adaptive linear read-outs can transform temporal streams with remarkable efficiency. Yet fundamental questions about stability, memory and expressive power remain fragmented across disciplines. We present a unified, dynamical-systems treatment that weaves together functional analysis, random attractor theory and recent neuroscientific findings. First, on compact multivariate input alphabets we prove that the Echo-State Property (wash-out of initial conditions) together with global Lipschitz dynamics necessarily yields the Fading-Memory Property (geometric forgetting of remote inputs). Tight algebraic tests translate activation-specific Lipschitz constants into certified spectral-norm bounds, covering both saturating and rectifying nonlinearities. Second, employing a Stone-Weierstrass strategy we give a streamlined proof that ESNs with polynomial reservoirs and linear read-outs are dense in the Banach space of causal, time-invariant fading-memory filters, extending universality to stochastic inputs. Third, we quantify computational resources via the memory-capacity spectrum, show how topology and leak rate redistribute delay-specific capacities, and link these trade-offs to Lyapunov spectra at the "edge of chaos". Finally, casting ESNs as skew-product random dynamical systems, we establish the existence of singleton pullback attractors and derive conditional Lyapunov bounds, providing a rigorous analogue to cortical criticality. The analysis yields concrete design rules (spectral radius, input gain, activation choice) grounded simultaneously in mathematics and neuroscience, and clarifies why modest-sized reservoirs often rival fully trained recurrent networks in practice.
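To make the abstract's two central quantities concrete, here is a minimal sketch (not the authors' code; all sizes, gains and the regularisation constant are illustrative assumptions). It builds a tanh reservoir whose weight matrix is rescaled so that the spectral norm stays below 1, the kind of algebraic contraction test the paper uses to certify the Echo-State and Fading-Memory Properties for a 1-Lipschitz activation, and then estimates the delay-specific memory capacities MC_k and their sum with a ridge-regression read-out.

import numpy as np

# Illustrative sketch: contraction-certified ESN + memory-capacity spectrum.
rng = np.random.default_rng(0)
N, T, washout, max_delay = 200, 5000, 200, 50

# Random reservoir, rescaled so its largest singular value is 0.9 < 1,
# which (with the 1-Lipschitz tanh) guarantees a global contraction.
W = rng.standard_normal((N, N))
W *= 0.9 / np.linalg.svd(W, compute_uv=False)[0]
w_in = 0.5 * rng.standard_normal(N)          # input gain (assumed value)

u = rng.uniform(-1.0, 1.0, size=T)           # i.i.d. scalar input stream
x = np.zeros(N)
states = np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])         # leak-free update, tanh activation
    states[t] = x

# Ridge read-out trained to reconstruct the delayed input u(t - k) for each k;
# MC_k is the squared correlation between the read-out and the delayed target.
X = states[washout:]
reg = 1e-6 * np.eye(N)
P = np.linalg.solve(X.T @ X + reg, X.T)      # (X^T X + reg)^{-1} X^T
mc = []
for k in range(1, max_delay + 1):
    target = u[washout - k:T - k]
    w_out = P @ target
    y = X @ w_out
    c = np.corrcoef(y, target)[0, 1]
    mc.append(c ** 2)                        # delay-k memory capacity

print("total memory capacity ~", sum(mc))

Sweeping the 0.9 rescaling factor or the input gain in this sketch redistributes the MC_k across delays, which is the trade-off the abstract links to topology, leak rate and the Lyapunov spectrum near the edge of chaos.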
Similar Papers
Echo State Networks as State-Space Models: A Systems Perspective
Machine Learning (CS)
Makes smart computers learn faster and better.
Towards a Comprehensive Theory of Reservoir Computing
Neural and Evolutionary Computing
Predicts how well computer memory systems work.
Reservoir Network with Structural Plasticity for Human Activity Recognition
Machine Learning (CS)
Lets small computers learn and predict things locally.