Learning Beyond Experience: Generalizing to Unseen State Space with Reservoir Computing
By: Declan A. Norton, Yuanzhao Zhang, Michelle Girvan
Potential Business Impact:
Enables data-driven models to predict system behaviors that were never observed during training.
Machine learning techniques offer an effective approach to modeling dynamical systems solely from observed data. However, without explicit structural priors (built-in assumptions about the underlying dynamics), these techniques typically struggle to generalize to aspects of the dynamics that are poorly represented in the training data. Here, we demonstrate that reservoir computing (RC), a simple, efficient, and versatile machine learning framework often used for data-driven modeling of dynamical systems, can generalize to unexplored regions of state space without explicit structural priors. First, we describe a multiple-trajectory training scheme for reservoir computers that supports training across a collection of disjoint time series, enabling effective use of available training data. Then, applying this training scheme to multistable dynamical systems, we show that RCs trained on trajectories from a single basin of attraction can achieve out-of-domain generalization by capturing system behavior in entirely unobserved basins.
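To make the multiple-trajectory training scheme concrete, the following is a minimal sketch of a standard echo-state-network-style reservoir computer trained by ridge regression over several disjoint time series. All specifics here (reservoir size, spectral radius, washout length, ridge penalty, and the toy sine-wave data) are illustrative assumptions, not the authors' actual settings; the key idea shown is pooling post-washout states from each separate trajectory before fitting a single linear readout.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative hyperparameters (assumed, not taken from the paper).
N, D = 200, 1                  # reservoir size and input dimension
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # rescale spectral radius to 0.9
W_in = rng.uniform(-0.5, 0.5, size=(N, D))
washout, ridge = 50, 1e-6      # transient to discard per trajectory; readout penalty

def run_reservoir(u):
    """Drive the reservoir with an input series u (T x D); return states (T x N)."""
    r = np.zeros(N)
    states = np.empty((len(u), N))
    for t, u_t in enumerate(u):
        r = np.tanh(W @ r + W_in @ u_t)
        states[t] = r
    return states

def train_readout(trajectories):
    """Multiple-trajectory training: restart the reservoir for each disjoint
    time series, discard a washout transient per trajectory, then fit one
    ridge-regression readout on the pooled (state, next-value) pairs."""
    R, Y = [], []
    for u in trajectories:
        s = run_reservoir(u[:-1])        # states at time t predict u at t+1
        R.append(s[washout:])
        Y.append(u[1 + washout:])
    R, Y = np.vstack(R), np.vstack(Y)
    return np.linalg.solve(R.T @ R + ridge * np.eye(N), R.T @ Y)

# Toy stand-in for disjoint trajectories from a single basin of attraction.
trajectories = [np.sin(np.linspace(0, 20, 400))[:, None],
                np.sin(np.linspace(30, 50, 400))[:, None]]
W_out = train_readout(trajectories)      # shared readout, shape (N, D)
```

Restarting the reservoir state per trajectory (rather than carrying it across segment boundaries) is what lets the scheme exploit a collection of disjoint time series; the washout ensures each segment's initial transient does not contaminate the fit.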
Similar Papers
Next-Generation Reservoir Computing for Dynamical Inference
Machine Learning (Stat)
Models complex systems from messy data.
Denoising and Reconstruction of Nonlinear Dynamics using Truncated Reservoir Computing
Machine Learning (CS)
Cleans messy data to reveal hidden patterns.
Boosting Reservoir Computing with Brain-inspired Adaptive Dynamics
Neural and Evolutionary Computing
Makes smart computers learn better, like a brain.