Towards a Comprehensive Theory of Reservoir Computing
By: Denis Kleyko, Christopher J. Kymn, E. Paxon Frady, et al.
Potential Business Impact:
Predicts the memory capacity and accuracy of reservoir computing models before deployment, enabling principled model selection and tuning.
In reservoir computing, an input sequence is processed by a recurrent neural network, the reservoir, which transforms it into a spatial pattern that a shallow readout network can then exploit for tasks such as memorization and time-series prediction or classification. Echo state networks (ESNs) are a model class in which the reservoir is a traditional artificial neural network. This class contains many model types, each with its own set of hyperparameters, so selecting models and parameter settings for particular applications requires a theory for predicting and comparing performance. Here, we demonstrate that recent developments in perceptron theory can be used to predict the memory capacity and accuracy of a wide variety of ESN models, including reservoirs with linear neurons, sigmoid nonlinear neurons, different types of recurrent matrices, and different types of readout networks. Across thirty ESN variants, empirical results consistently confirm the theory's predictions. As a practical demonstration, the theory is used to optimize the memory capacity of an ESN over the entire joint parameter space. Further, guided by the theory, we propose a novel ESN model whose readout network requires no training, yet outperforms earlier ESN models. Finally, we characterize the geometry of the readout networks in ESNs, revealing that many ESN models exhibit a regular simplex geometry similar to that observed in the output weights of deep neural networks.
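For concreteness, here is a minimal NumPy sketch of the setup the abstract describes: a random recurrent reservoir with sigmoid (tanh) neurons driven by a scalar input, a ridge-regression linear readout, and the standard delayed-recall estimate of memory capacity. The dimensions, the spectral-radius scaling of 0.9, and the ridge parameter are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes chosen for illustration only.
n_reservoir, n_steps, max_delay = 100, 2000, 50

# Random input weights and a recurrent matrix rescaled so its spectral
# radius is below 1 (a common sufficient condition for the echo state
# property).
W_in = rng.uniform(-0.5, 0.5, size=n_reservoir)
W = rng.normal(size=(n_reservoir, n_reservoir))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

# Drive the reservoir with a scalar white-noise input sequence and
# collect the reservoir states (the "spatial pattern" per time step).
u = rng.uniform(-1, 1, size=n_steps)
X = np.zeros((n_steps, n_reservoir))
x = np.zeros(n_reservoir)
for t in range(n_steps):
    x = np.tanh(W @ x + W_in * u[t])  # sigmoid-nonlinear reservoir update
    X[t] = x

# Linear readout trained by ridge regression to reconstruct u[t - k];
# memory capacity is the sum of squared correlations over delays k.
washout, ridge = 100, 1e-6
capacity = 0.0
for k in range(1, max_delay + 1):
    Xk, yk = X[washout:], np.roll(u, k)[washout:]
    w = np.linalg.solve(Xk.T @ Xk + ridge * np.eye(n_reservoir), Xk.T @ yk)
    pred = Xk @ w
    capacity += np.corrcoef(pred, yk)[0, 1] ** 2

print(f"Estimated memory capacity: {capacity:.2f}")
```

Swapping the tanh for the identity function yields the linear-reservoir variant mentioned in the abstract, and different choices of W (orthogonal, sparse, permutation-based) correspond to the different recurrent-matrix types whose performance the theory is said to predict.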
Similar Papers
Reservoir Network with Structural Plasticity for Human Activity Recognition
Machine Learning (CS)
Lets small, low-power devices recognize human activities locally.
Contraction, Criticality, and Capacity: A Dynamical-Systems Perspective on Echo-State Networks
Neural and Evolutionary Computing
Explains when echo state networks best remember past inputs.
Deep Residual Echo State Networks: exploring residual orthogonal connections in untrained Recurrent Neural Networks
Machine Learning (CS)
Improves long-term memory in untrained recurrent networks.