Echo State Networks as State-Space Models: A Systems Perspective
By: Pradeep Singh, Balasubramanian Raman
Potential Business Impact:
Makes smart computers learn faster and better.
Echo State Networks (ESNs) are typically presented as efficient, readout-trained recurrent models, yet their dynamics and design are often guided by heuristics rather than first principles. We recast ESNs explicitly as state-space models (SSMs), providing a unified systems-theoretic account that links reservoir computing with classical identification and modern kernelized SSMs.

First, we show that the echo-state property is an instance of input-to-state stability for a contractive nonlinear SSM and derive verifiable conditions in terms of leak, spectral scaling, and activation Lipschitz constants.

Second, we develop two complementary mappings: (i) small-signal linearizations that yield locally valid LTI SSMs with interpretable poles and memory horizons; and (ii) lifted/Koopman random-feature expansions that render the ESN a linear SSM in an augmented state, enabling transfer-function and convolutional-kernel analyses. This perspective yields frequency-domain characterizations of memory spectra and clarifies when ESNs emulate structured SSM kernels.

Third, we cast teacher forcing as state estimation and propose Kalman/EKF-assisted readout learning, together with EM for hyperparameters (leak, spectral radius, process/measurement noise) and a hybrid subspace procedure for spectral shaping under contraction constraints.
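To make the first point concrete, here is the textbook sufficient condition that the "leak, spectral scaling, and activation Lipschitz constants" phrasing points at, written under assumed notation (leak rate α, reservoir matrix W, input map W_in, activation φ with Lipschitz constant L_φ); this is a standard contraction bound, not necessarily the paper's exact derivation:

```latex
% Leaky ESN state update (assumed notation):
%   x_k: reservoir state, u_k: input, alpha in (0, 1]: leak rate
\[
x_{k+1} = (1 - \alpha)\, x_k + \alpha\, \phi\!\left( W x_k + W_{\mathrm{in}} u_k \right)
\]
% For phi with Lipschitz constant L_phi (L_phi = 1 for tanh), the update is a
% contraction uniformly in the input, hence the echo-state property holds, whenever
\[
(1 - \alpha) + \alpha\, L_\phi\, \lVert W \rVert_2 < 1
\quad\Longleftrightarrow\quad
L_\phi\, \lVert W \rVert_2 < 1 .
\]
```

The contraction factor (1 − α) + α L_φ ‖W‖₂ also bounds how quickly initial-condition information decays, which is the input-to-state-stability reading of the echo-state property that the abstract gives.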
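A minimal numerical check of the same property, assuming a standard leaky tanh reservoir; every name here (`esn_step`, `alpha`, the 0.9 spectral-norm scaling) is illustrative rather than taken from the paper:

```python
import numpy as np

# Sketch: verify the sufficient contraction condition and probe the
# echo-state property empirically by driving two different initial states
# with the same input sequence and watching them converge.
rng = np.random.default_rng(0)
n, m = 200, 1                       # reservoir size, input dimension
alpha = 0.3                         # leak rate
W = rng.standard_normal((n, n))
W *= 0.9 / np.linalg.norm(W, 2)     # scale spectral norm to 0.9 < 1
W_in = rng.standard_normal((n, m))

def esn_step(x, u):
    """One leaky-integrator update; tanh has Lipschitz constant 1."""
    return (1 - alpha) * x + alpha * np.tanh(W @ x + W_in @ u)

# Sufficient condition: (1 - alpha) + alpha * L_phi * ||W||_2 < 1
print("contraction factor:", (1 - alpha) + alpha * np.linalg.norm(W, 2))

# Empirical check: trajectories from different initial states should merge.
x_a, x_b = rng.standard_normal(n), rng.standard_normal(n)
for _ in range(500):
    u = rng.standard_normal(m)
    x_a, x_b = esn_step(x_a, u), esn_step(x_b, u)
print("state gap after 500 steps:", np.linalg.norm(x_a - x_b))
```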
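The small-signal linearization in (i) is also easy to sketch: differentiating the leaky update at an operating point gives a local LTI pair (A, B) whose eigenvalues act as poles and set a local memory horizon. The operating point, sizes, and scalings below are assumptions for illustration:

```python
import numpy as np

# Sketch: linearize the leaky ESN around an operating point (x*, u*), giving
#   dx_{k+1} ~ A dx_k + B du_k,  with  A = (1 - alpha) I + alpha diag(phi'(z*)) W.
rng = np.random.default_rng(0)
n, m, alpha = 200, 1, 0.3
W = rng.standard_normal((n, n))
W *= 0.9 / np.linalg.norm(W, 2)
W_in = rng.standard_normal((n, m))

x_star, u_star = np.zeros(n), np.zeros(m)   # origin is a fixed point for tanh
z = W @ x_star + W_in @ u_star              # pre-activation at the operating point
D = np.diag(1.0 - np.tanh(z) ** 2)          # tanh'(z) on the diagonal
A = (1 - alpha) * np.eye(n) + alpha * D @ W
B = alpha * D @ W_in

rho_local = np.abs(np.linalg.eigvals(A)).max()   # slowest local pole
tau = -1.0 / np.log(rho_local)                   # e-folding memory, in steps
print(f"local spectral radius {rho_local:.3f}, memory horizon ~ {tau:.1f} steps")
```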
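Finally, the Kalman-assisted readout learning of the third point admits a compact sketch under one common reading: model the readout weights as a slowly drifting state, so the Kalman filter reduces to regularized recursive least squares. The noise levels `q` and `r` are assumed values, and the function is an illustration of the idea rather than the paper's algorithm:

```python
import numpy as np

# Sketch: readout weights w as a random-walk state,
#   w_{k+1} = w_k + process noise,   y_k = x_k . w_k + measurement noise,
# estimated online by a scalar-output Kalman filter (equivalently, RLS).
def kalman_readout(states, targets, q=1e-6, r=1e-2):
    n = states.shape[1]
    w = np.zeros(n)                 # weight estimate
    P = np.eye(n)                   # posterior covariance of w
    for x, y in zip(states, targets):
        P = P + q * np.eye(n)       # predict: random-walk drift on weights
        s = x @ P @ x + r           # innovation variance
        k = P @ x / s               # Kalman gain
        w = w + k * (y - x @ w)     # correct with the prediction error
        P = P - np.outer(k, x @ P)  # covariance shrinks with each sample
    return w
```

Stacking reservoir states (e.g., from `esn_step` above) as rows of `states` yields the readout vector; the EM step over leak, spectral radius, and the noise levels mentioned in the abstract would plausibly wrap around a filter of this form.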
Similar Papers
Contraction, Criticality, and Capacity: A Dynamical-Systems Perspective on Echo-State Networks
Neural and Evolutionary Computing
Makes computers remember past information better.
Echo State Networks for Bitcoin Time Series Prediction
Machine Learning (CS)
Predicts crypto prices even when markets are crazy.
Towards a Comprehensive Theory of Reservoir Computing
Neural and Evolutionary Computing
Predicts how well computer memory systems work.