Dynamical Learning in Deep Asymmetric Recurrent Neural Networks

Published: September 5, 2025 | arXiv ID: 2509.05041v1

By: Davide Badalotti, Carlo Baldassi, Marc Mézard, et al.

Potential Business Impact:

Networks learn input-output associations through their own recurrent dynamics, with no gradient computation or backpropagation required.

Business Areas:
Intelligent Systems, Artificial Intelligence, Data and Analytics, Science and Engineering

We show that asymmetric deep recurrent neural networks, enhanced with additional sparse excitatory couplings, give rise to an exponentially large, dense, accessible manifold of internal representations that can be found by different algorithms, including simple iterative dynamics. Building on the geometrical properties of the stable configurations, we propose a distributed learning scheme in which input-output associations emerge naturally from the recurrent dynamics, without any need for gradient evaluation. A critical feature enabling the learning process is the stability of the configurations reached at convergence, even after removal of the supervisory output signal. Extensive simulations demonstrate that this approach performs competitively on standard AI benchmarks. The model can be generalized in multiple directions, both computational and biological, potentially contributing to narrowing the gap between AI and computational neuroscience.
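Since the abstract describes the scheme only at a high level, the NumPy sketch below illustrates the general idea rather than the paper's actual algorithm: binary ±1 units with asymmetric Gaussian couplings plus sparse excitatory ones, asynchronous sign dynamics run to a fixed point, and a perceptron-style local update standing in for the paper's distributed rule. The names (`run_dynamics`, `stabilize`, `train_step`) and all sizes, sparsity levels, learning rates, and margins are illustrative assumptions, not values from the paper. During training, the inputs and the supervisory output are clamped; after relaxation, couplings are reinforced locally so the reached configuration stays stable once the output signal is removed, mirroring the stability property highlighted in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Setup: all sizes and constants below are illustrative assumptions ---
N = 200                              # total number of +/-1 neurons
IN = np.arange(0, 20)                # indices of the input units
OUT = np.arange(20, 25)              # indices of the output units

# Asymmetric Gaussian couplings: J[i, j] and J[j, i] drawn independently.
J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
np.fill_diagonal(J, 0.0)

# Additional sparse excitatory (positive) couplings, per the abstract.
exc = (rng.random((N, N)) < 0.05) & ~np.eye(N, dtype=bool)
J[exc] += 0.5 / np.sqrt(N)


def run_dynamics(J, s, clamped, max_sweeps=200):
    """Asynchronous sign dynamics on the free units until a fixed point."""
    s = s.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(N):
            if clamped[i]:
                continue             # inputs / supervisory outputs stay fixed
            new = 1.0 if J[i] @ s >= 0.0 else -1.0
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:              # fixed point reached
            break
    return s


def stabilize(J, s, lr=0.05, kappa=1.0):
    """Local, gradient-free update (a perceptron-style stand-in for the
    paper's distributed rule): every unit whose state is not supported
    with margin kappa strengthens its own incoming couplings. Row-wise
    updates keep J asymmetric."""
    weak = s * (J @ s) < kappa
    J[weak] += lr * np.outer(s[weak], s) / np.sqrt(N)
    np.fill_diagonal(J, 0.0)


def train_step(J, x, y):
    """Clamp the inputs AND the supervisory output, relax the bulk, then
    stabilize the reached configuration so it survives removal of y."""
    s = rng.choice([-1.0, 1.0], size=N)
    s[IN], s[OUT] = x, y
    clamped = np.zeros(N, dtype=bool)
    clamped[IN] = clamped[OUT] = True
    stabilize(J, run_dynamics(J, s, clamped))


def predict(J, x):
    """Inference: clamp only the inputs; read the now-free output units."""
    s = rng.choice([-1.0, 1.0], size=N)
    s[IN] = x
    clamped = np.zeros(N, dtype=bool)
    clamped[IN] = True
    return run_dynamics(J, s, clamped)[OUT]


# Hypothetical usage: memorize one random input-output association.
x = rng.choice([-1.0, 1.0], size=len(IN))
y = rng.choice([-1.0, 1.0], size=len(OUT))
for _ in range(30):
    train_step(J, x, y)
print("target:", y, "\nread:  ", predict(J, x))
```

Whether this toy version actually recalls the stored association depends on the assumed sizes and number of repetitions; the paper's distributed rule, architecture, and use of the accessible manifold of representations will differ in detail.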

Page Count
26 pages

Category
Condensed Matter:
Disordered Systems and Neural Networks