Dynamical stability for dense patterns in discrete attractor neural networks

Published: July 14, 2025 | arXiv ID: 2507.10383v1

By: Uri Cohen, Máté Lengyel

Potential Business Impact:

Characterizes when stored memories in attractor network models remain dynamically stable, informing the design of more reliable associative-memory systems.

Business Areas:
Intelligent Systems, Artificial Intelligence, Data and Analytics, Science and Engineering

Neural networks storing multiple discrete attractors are canonical models of biological memory. Previously, the dynamical stability of such networks could only be guaranteed under highly restrictive conditions. Here, we derive a theory of the local stability of discrete fixed points in a broad class of networks with graded neural activities and in the presence of noise. By directly analyzing the bulk and outliers of the Jacobian spectrum, we show that all fixed points are stable below a critical load that is distinct from the classical critical capacity and depends on the statistics of neural activities in the fixed points as well as the single-neuron activation function. Our analysis highlights the computational benefits of threshold-linear activation and sparse-like patterns.
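The kind of stability check the abstract describes can be illustrated with a minimal sketch (this is an assumption-laden toy, not the paper's derivation): for a graded recurrent network dx/dt = -x + W·φ(x) + b with threshold-linear φ, a fixed point x* is locally stable when every eigenvalue of the Jacobian J = -I + W·diag(φ'(x*)) has negative real part. The network size, weight scale, and input below are illustrative choices, not the paper's load regime.

```python
import numpy as np

# Toy recurrent network: dx/dt = -x + W @ phi(x) + b,
# with phi = threshold-linear (ReLU). Illustrative only.
rng = np.random.default_rng(0)
N = 200

# Weak random symmetric coupling so the dynamics contract to a fixed point
# (spectral radius of W well below 1; not the paper's critical-load setting).
W = rng.normal(0.0, 0.2 / np.sqrt(N), size=(N, N))
W = (W + W.T) / 2

phi = lambda x: np.maximum(x, 0.0)   # threshold-linear activation
b = rng.normal(0.5, 1.0, size=N)     # constant external input (assumed)

# Find a fixed point of the dynamics by damped fixed-point iteration.
x = np.zeros(N)
for _ in range(2000):
    x = 0.9 * x + 0.1 * (W @ phi(x) + b)

# Jacobian at the fixed point: phi'(x) = 1 where x > 0, else 0,
# so J = -I + W * diag(gain). Stability <=> max Re(eig(J)) < 0.
gain = (x > 0).astype(float)
J = -np.eye(N) + W * gain[None, :]

max_re = np.linalg.eigvals(J).real.max()
print(f"max Re(eigenvalue) = {max_re:.3f} -> locally stable: {max_re < 0}")
```

Because the weights here are scaled weakly, the bulk of the Jacobian spectrum sits well to the left of zero and the fixed point is stable; the paper's contribution is characterizing where this breaks down as the number of stored patterns (the load) grows.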

Page Count
12 pages

Category
Condensed Matter:
Disordered Systems and Neural Networks