Residual Reservoir Memory Networks
By: Matteo Pinna, Andrea Ceni, Claudio Gallicchio
Potential Business Impact:
Helps computers remember events from the distant past.
We introduce a novel class of untrained Recurrent Neural Networks (RNNs) within the Reservoir Computing (RC) paradigm, called Residual Reservoir Memory Networks (ResRMNs). A ResRMN combines a linear memory reservoir with a non-linear reservoir, where the latter uses residual orthogonal connections along the temporal dimension to enhance long-term propagation of the input. The resulting reservoir state dynamics are studied through the lens of linear stability analysis, and we investigate diverse configurations for the temporal residual connections. The proposed approach is empirically assessed on time-series and pixel-level 1-D classification tasks. Our experimental results highlight the advantages of the proposed approach over other conventional RC models.
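The abstract's architecture can be illustrated with a minimal sketch. This is not the paper's actual formulation: the wiring (a linear memory reservoir whose state, together with the input, drives a non-linear reservoir with an orthogonal residual branch along time), the mixing coefficient `beta`, and all dimensions and weight scalings are assumptions made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_mem, n_res, T = 1, 50, 100, 200  # assumed toy dimensions

# Linear memory reservoir: an orthogonal cyclic-shift matrix propagates
# past inputs along a delay line without non-linear distortion.
V_in = rng.uniform(-0.1, 0.1, (n_mem, n_in))
V = np.roll(np.eye(n_mem), 1, axis=0)

# Non-linear reservoir: random untrained weights, plus a random orthogonal
# matrix O used as a residual connection along the temporal dimension.
W_in = rng.uniform(-0.1, 0.1, (n_res, n_in + n_mem))
W = rng.uniform(-1.0, 1.0, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius to 0.9
O, _ = np.linalg.qr(rng.standard_normal((n_res, n_res)))
beta = 0.9  # assumed mixing weight between the residual and non-linear branches

u = rng.standard_normal((T, n_in))  # toy input sequence
m = np.zeros(n_mem)                 # linear memory reservoir state
x = np.zeros(n_res)                 # non-linear reservoir state
states = []
for t in range(T):
    m = V @ m + V_in @ u[t]              # linear memory dynamics
    z = np.concatenate([u[t], m])        # input + memory drive the reservoir
    x = beta * (O @ x) + (1.0 - beta) * np.tanh(W @ x + W_in @ z)
    states.append(np.concatenate([m, x]))
states = np.stack(states)  # (T, n_mem + n_res) features for a trained readout
```

As in standard RC, only a linear readout on `states` would be trained; the reservoir weights above stay fixed. The orthogonal residual branch preserves the norm of the previous state, which is one way to support long-term propagation of input information.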
Similar Papers
Deep Residual Echo State Networks: exploring residual orthogonal connections in untrained Recurrent Neural Networks
Machine Learning (CS)
Helps computers remember long-term information better.
Reservoir Computing: A New Paradigm for Neural Networks
Machine Learning (CS)
Makes computers learn from messy, changing information.
Neuronal correlations shape the scaling behavior of memory capacity and nonlinear computational capability of reservoir recurrent neural networks
Disordered Systems and Neural Networks
Makes computers learn faster with more brain cells.