Stochastic dynamics learning with state-space systems
By: Juan-Pablo Ortega, Florian Rossmannek
Potential Business Impact:
Makes computers remember past information better.
This work advances the theoretical foundations of reservoir computing (RC) by providing a unified treatment of fading memory and the echo state property (ESP) in both deterministic and stochastic settings. We investigate state-space systems, a central model class in time series learning, and establish that fading memory and solution stability hold generically -- even in the absence of the ESP -- offering a robust explanation for the empirical success of RC models without strict contractivity conditions. In the stochastic case, we critically assess stochastic echo states, proposing a novel distributional perspective rooted in attractor dynamics on the space of probability distributions, which leads to a rich and coherent theory. Our results extend and generalize previous work on non-autonomous dynamical systems, offering new insights into causality, stability, and memory in RC models. This lays the groundwork for reliable generative modeling of temporal data in both deterministic and stochastic regimes.
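The echo state property discussed in the abstract can be illustrated with a minimal sketch (not taken from the paper): for a state-space system x_{t+1} = tanh(W x_t + w u_t), rescaling W so its operator norm is below 1 is a sufficient condition for contractivity, under which two trajectories driven by the same input but started from different initial states converge, so the state asymptotically depends only on the input history (fading memory). All sizes and scaling constants below are arbitrary choices for the demonstration.

```python
# Sketch of the echo state property (ESP) for a contractive
# state-space system x_{t+1} = tanh(W x_t + w_in * u_t).
import numpy as np

rng = np.random.default_rng(0)
n = 50  # reservoir dimension (arbitrary for this sketch)

# Random reservoir matrix, rescaled so its operator (2-)norm is 0.9.
# Since tanh is 1-Lipschitz, this makes the state map a contraction,
# a standard sufficient condition for the ESP.
W = rng.standard_normal((n, n))
W *= 0.9 / np.linalg.norm(W, 2)
w_in = rng.standard_normal(n)

def run(x0, inputs):
    """Drive the system from initial state x0 with the given input sequence."""
    x = x0
    for u in inputs:
        x = np.tanh(W @ x + w_in * u)
    return x

inputs = rng.standard_normal(200)  # one shared input sequence

# Same inputs, two different random initial states.
x_a = run(rng.standard_normal(n), inputs)
x_b = run(rng.standard_normal(n), inputs)

# The trajectories converge: the initial condition is "forgotten",
# which is exactly the echo state / fading memory behavior.
print(np.linalg.norm(x_a - x_b))
```

The contraction factor 0.9 per step makes the gap between the two final states shrink like 0.9^200, so the printed distance is negligible; the paper's point is that fading memory and stability can hold generically even without such a strict contractivity condition.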
Similar Papers
Dynamics and Computational Principles of Echo State Networks: A Mathematical Perspective
Machine Learning (CS)
Teaches computers to learn from past events.
Contraction, Criticality, and Capacity: A Dynamical-Systems Perspective on Echo-State Networks
Neural and Evolutionary Computing
Makes computers remember past information better.
Echo State Networks as State-Space Models: A Systems Perspective
Machine Learning (CS)
Makes smart computers learn faster and better.