Neuronal correlations shape the scaling behavior of memory capacity and nonlinear computational capability of reservoir recurrent neural networks
By: Shotaro Takasu, Toshio Aoyagi
Potential Business Impact:
Predicts how much memory and processing power a reservoir computer gains as more neurons are read out, guiding scalable, cost-effective designs.
Reservoir computing is a powerful framework for real-time information processing, characterized by high computational ability and fast learning, with applications ranging from machine learning to biological systems. In this paper, we investigate how the computational ability of reservoir recurrent neural networks (RNNs) scales with an increasing number of readout neurons. First, we demonstrate that the memory capacity of a reservoir RNN scales sublinearly with the number of readout neurons. To explain this observation, we develop a theoretical framework for analytically deriving memory capacity that incorporates the effect of neuronal correlations, which prior theoretical work has ignored for analytical simplicity. Our theory relates the sublinear scaling of memory capacity to the strength of neuronal correlations. Furthermore, we show that this principle holds across diverse types of RNNs, even those beyond the direct scope of our theory. Next, we numerically investigate the scaling behavior of nonlinear computational ability, which, alongside memory capacity, is crucial for overall computational performance. Our simulations reveal that as memory capacity growth becomes sublinear, increasing the number of readout neurons successively enables nonlinear processing at progressively higher polynomial orders. Our theoretical framework suggests that neuronal correlations govern not only memory capacity but also this sequential growth of nonlinear computational capability. These findings establish a foundation for designing scalable and cost-effective reservoir computing, offering new insights into the interplay among neuronal correlations, linear memory, and nonlinear processing.
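To make the scaling experiment concrete, the following is a minimal sketch (not the authors' code) of the standard setup: an echo state network is driven by an i.i.d. input stream, K readout neurons are sampled from the reservoir, and linear memory capacity is measured as MC = sum_k MC_k, where MC_k is the squared correlation between the delayed input u(t-k) and a ridge-regression readout trained to reconstruct it. The reservoir size, spectral radius, delay range, and ridge penalty below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def run_reservoir(u, n=500, rho=0.9, seed=1):
    # Drive a random tanh reservoir with the scalar input sequence u
    # and return the state trajectory X (one row per time step).
    r = np.random.default_rng(seed)
    W = r.normal(0.0, 1.0, (n, n))
    W *= rho / np.max(np.abs(np.linalg.eigvals(W)))  # fix spectral radius
    w_in = r.normal(0.0, 1.0, n)
    x = np.zeros(n)
    X = np.empty((len(u), n))
    for t, ut in enumerate(u):
        x = np.tanh(W @ x + w_in * ut)
        X[t] = x
    return X

def memory_capacity(X, u, max_delay=100, alpha=1e-6):
    # MC = sum over delays k of the squared correlation between the
    # delayed input u(t-k) and a ridge-regression readout trained on X.
    n = X.shape[1]
    mc = 0.0
    for k in range(1, max_delay + 1):
        target = u[:-k]      # u(t - k)
        feats = X[k:]        # reservoir state at time t
        w = np.linalg.solve(feats.T @ feats + alpha * np.eye(n),
                            feats.T @ target)
        mc += np.corrcoef(feats @ w, target)[0, 1] ** 2
    return mc

T = 5200
u = rng.uniform(-1.0, 1.0, T)        # i.i.d. input stream
X = run_reservoir(u)
X, u = X[200:], u[200:]              # discard the initial transient

# Sample K readout neurons from the reservoir; MC should grow
# sublinearly in K once neuronal correlations start to matter.
for K in (10, 20, 50, 100, 200, 400):
    idx = rng.choice(X.shape[1], K, replace=False)
    print(K, round(memory_capacity(X[:, idx], u), 2))

In a setup like this, MC typically grows with K but increasingly slowly, saturating well below K; that saturating regime is the sublinear scaling the paper attributes to correlations among the sampled neurons.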
Similar Papers
Illuminating the Black Box of Reservoir Computing
Neural and Evolutionary Computing
Makes computers learn with fewer parts.
Reservoir Computing: A New Paradigm for Neural Networks
Machine Learning (CS)
Makes computers learn from messy, changing information.
Residual Reservoir Memory Networks
Machine Learning (CS)
Helps computers remember long past events better.