Score: 3

Towards a Comprehensive Theory of Reservoir Computing

Published: November 18, 2025 | arXiv ID: 2511.14484v1

By: Denis Kleyko, Christopher J. Kymn, E. Paxon Frady, and more

BigTech Affiliations: University of California, Berkeley; Intel

Potential Business Impact:

Predicts the memory capacity and accuracy of reservoir computing (echo state network) models, enabling principled model and hyperparameter selection without exhaustive trial and error.

Business Areas:
Intelligent Systems, Artificial Intelligence, Data and Analytics, Science and Engineering

In reservoir computing, an input sequence is processed by a recurrent neural network, the reservoir, which transforms it into a spatial pattern that a shallow readout network can then exploit for tasks such as memorization and time-series prediction or classification. Echo state networks (ESNs) are a model class in which the reservoir is a traditional artificial neural network. This class contains many model types, each with its own set of hyperparameters. Selecting models and parameter settings for particular applications requires a theory for predicting and comparing performance. Here, we demonstrate that recent developments in perceptron theory can be used to predict the memory capacity and accuracy of a wide variety of ESN models, including reservoirs with linear neurons, sigmoid nonlinear neurons, different types of recurrent matrices, and different types of readout networks. Across thirty variants of ESNs, we show that empirical results consistently confirm the theory's predictions. As a practical demonstration, the theory is used to optimize the memory capacity of an ESN over the entire joint parameter space. Further, guided by the theory, we propose a novel ESN model with a readout network that does not require training, and which outperforms earlier ESN models despite requiring no training. Finally, we characterize the geometry of the readout networks in ESNs, which reveals that many ESN models exhibit a regular simplex geometry similar to that observed in the output weights of deep neural networks.
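To make the setup concrete, below is a minimal sketch of an echo state network with a tanh reservoir, a random recurrent matrix rescaled to a chosen spectral radius, and a ridge-regression readout used to estimate memory capacity as a sum of squared delay-reconstruction correlations. The reservoir size, spectral radius, delay range, and regularization are illustrative assumptions, not the settings or the theoretical machinery used in the paper.

```python
# Minimal ESN sketch (illustrative hyperparameters, not the paper's settings).
import numpy as np

rng = np.random.default_rng(0)

N = 200                 # reservoir size
T = 2000                # length of the input sequence
delays = range(1, 31)   # delays used to estimate memory capacity
washout = 100           # initial states discarded before fitting

# Random input weights and recurrent matrix, rescaled so the spectral
# radius is below 1 (a common echo-state heuristic).
W_in = rng.uniform(-0.5, 0.5, size=N)
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

# Drive the reservoir with an i.i.d. uniform input sequence.
u = rng.uniform(-1.0, 1.0, size=T)
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])   # sigmoid-type reservoir nonlinearity
    states[t] = x

# Linear readout per delay d: predict u[t - d] from the state at time t,
# fitted with ridge regression on the washout-trimmed segment.
X = states[washout:]
ridge = 1e-6 * np.eye(N)
capacity = 0.0
for d in delays:
    y = u[washout - d: T - d]
    w_out = np.linalg.solve(X.T @ X + ridge, X.T @ y)
    y_hat = X @ w_out
    # Memory capacity sums the squared correlation over delays.
    capacity += np.corrcoef(y_hat, y)[0, 1] ** 2

print(f"Estimated memory capacity over {len(delays)} delays: {capacity:.2f}")
```

In this empirical setup, memory capacity is what the paper's perceptron-theory analysis predicts analytically; sweeping the spectral radius or input scaling in the sketch corresponds to the joint parameter optimization the abstract describes.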

Country of Origin
🇸🇪 🇺🇸 Sweden, United States

Page Count
15 pages

Category
Computer Science:
Neural and Evolutionary Computing