On the emergence of numerical instabilities in Next Generation Reservoir Computing
By: Edmilson Roque dos Santos, Erik Bollt
Potential Business Impact:
Makes machine-learning forecasts of chaotic systems more stable and reliable.
Next Generation Reservoir Computing (NGRC) is a low-cost machine learning method for forecasting chaotic time series from data. However, ensuring the dynamical stability of NGRC models during autonomous prediction remains a challenge. In this work, we uncover a key connection between the numerical conditioning of the NGRC feature matrix -- formed by polynomial evaluations on time-delay coordinates -- and the long-term NGRC dynamics. Merging tools from numerical linear algebra and ergodic theory of dynamical systems, we systematically study how the feature matrix conditioning varies across hyperparameters. We demonstrate that the NGRC feature matrix tends to be ill-conditioned for short time lags and high-degree polynomials. Ill-conditioning amplifies sensitivity to training data perturbations, which can produce unstable NGRC dynamics. We evaluate the impact of different numerical algorithms (Cholesky, SVD, and LU) for solving the regularized least-squares problem.
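Below is a minimal sketch, not the authors' implementation, of the pipeline the abstract describes: building polynomial features on time-delay coordinates, checking the feature matrix's condition number, and solving the ridge-regularized least-squares readout by the three routes named in the abstract (Cholesky and LU on the normal equations, and SVD of the feature matrix). The delay count k, lag s, polynomial degree, regularization strength beta, and the random toy trajectory are illustrative assumptions.

import numpy as np
from itertools import combinations_with_replacement

def ngrc_features(x, k=2, s=1, degree=2):
    # x: trajectory of shape (T, d); k delay taps, lag s, max polynomial degree.
    # Returns the NGRC feature matrix Phi, one row per usable time index.
    T, d = x.shape
    rows = T - (k - 1) * s
    # Linear block: the k time-delayed copies of the state, side by side.
    lin = np.hstack([x[i * s : i * s + rows] for i in range(k)])
    cols = [np.ones(rows), lin]  # constant + degree-1 terms
    # Higher-degree monomials in the delayed coordinates.
    for deg in range(2, degree + 1):
        for idx in combinations_with_replacement(range(lin.shape[1]), deg):
            cols.append(np.prod(lin[:, list(idx)], axis=1))
    return np.column_stack(cols)

# Toy data standing in for a chaotic trajectory (e.g. Lorenz samples).
rng = np.random.default_rng(0)
x = rng.standard_normal((500, 3))

Phi = ngrc_features(x, k=2, s=1, degree=3)
print("cond(Phi) =", np.linalg.cond(Phi))  # large values flag ill-conditioning

# With k=2 and s=1, row j of Phi corresponds to time t = j + 1, so the
# one-step-ahead targets are x[2:], aligned with Phi[:-1].
Phi_train, Y = Phi[:-1], x[2:]
beta = 1e-6  # ridge regularization strength

# Normal equations: both Cholesky and LU work with Phi^T Phi, whose
# condition number is the square of cond(Phi).
A = Phi_train.T @ Phi_train + beta * np.eye(Phi_train.shape[1])
b = Phi_train.T @ Y
L = np.linalg.cholesky(A)
W_chol = np.linalg.solve(L.T, np.linalg.solve(L, b))   # Cholesky route
W_lu = np.linalg.solve(A, b)                           # LU route

# SVD of Phi itself avoids squaring the condition number.
U, sv, Vt = np.linalg.svd(Phi_train, full_matrices=False)
W_svd = Vt.T @ ((sv / (sv**2 + beta))[:, None] * (U.T @ Y))

print("max |W_chol - W_svd| =", np.abs(W_chol - W_svd).max())
print("max |W_lu   - W_svd| =", np.abs(W_lu - W_svd).max())

In this sketch the SVD solution serves as the reference; the gap between it and the Cholesky or LU solutions gives a quick read on how much accuracy the squared conditioning of the normal equations costs for a given choice of lag and polynomial degree.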
Similar Papers
Adaptive control for multi-scale stochastic dynamical systems with stochastic next generation reservoir computing
Dynamical Systems
Controls brain signals to stop seizures.
Next-generation reservoir computing validated by classification task
Machine Learning (CS)
Teaches computers to sort and predict information.
Next-Generation Reservoir Computing for Dynamical Inference
Machine Learning (Stat)
Models complex systems from messy data.