Scaling Equilibrium Propagation to Deeper Neural Network Architectures

Published: September 30, 2025 | arXiv ID: 2509.26003v1

By: Sankar Vinayak E. P., Gopalakrishnan Srinivasan

Potential Business Impact:

Lets brain-inspired (neuromorphic) chips train deeper neural networks, with accuracy approaching standard backpropagation.

Business Areas:
Neuromorphic Computing, Science and Engineering

Equilibrium propagation has been proposed as a biologically plausible alternative to the backpropagation algorithm. The local nature of its gradient computations, combined with the use of convergent RNNs to reach equilibrium states, makes this approach well-suited for implementation on neuromorphic hardware. However, previous studies of equilibrium propagation have been restricted to networks containing only dense layers, or to relatively small architectures with a few convolutional layers followed by a final dense layer. These networks exhibit a significant accuracy gap relative to similarly sized feedforward networks trained with backpropagation. In this work, we introduce the Hopfield-Resnet architecture, which incorporates residual (or skip) connections into Hopfield networks with clipped $\mathrm{ReLU}$ as the activation function. The proposed architectural enhancements enable the training of networks with nearly twice the number of layers reported in prior works. For example, Hopfield-Resnet13 achieves 93.92\% accuracy on CIFAR-10, which is $\approx$3.5\% higher than the previous best result and comparable to that of Resnet13 trained using backpropagation.
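For concreteness, below is a minimal NumPy sketch of the kind of training the abstract describes: a small dense Hopfield network with a clipped-ReLU activation, one skip connection, and equilibrium propagation's two-phase (free/nudged) contrastive weight update. All layer sizes, initializations, and hyperparameters here are illustrative assumptions; the paper's actual Hopfield-Resnet uses convolutional residual blocks rather than this toy dense network.

```python
import numpy as np

rng = np.random.default_rng(0)

def rho(s):
    # Clipped ReLU: the activation function assumed for the Hopfield network.
    return np.clip(s, 0.0, 1.0)

# Toy layer sizes (illustrative; the paper uses convolutional Resnet blocks).
nx, n1, n2, ny = 8, 16, 16, 4
W1 = rng.normal(0.0, 0.1, (n1, nx))   # input -> hidden 1
W2 = rng.normal(0.0, 0.1, (n2, n1))   # hidden 1 -> hidden 2
W3 = rng.normal(0.0, 0.1, (ny, n2))   # hidden 2 -> output
Ws = rng.normal(0.0, 0.1, (ny, n1))   # skip (residual) path: hidden 1 -> output

def relax(x, y=None, beta=0.0, steps=30):
    """Run the fixed-point dynamics s_l <- rho(dPhi/ds_l) to equilibrium.
    With beta > 0, the output layer is nudged toward the target y."""
    s1, s2, s3 = np.zeros(n1), np.zeros(n2), np.zeros(ny)
    for _ in range(steps):
        s1 = rho(W1 @ x + W2.T @ s2 + Ws.T @ s3)
        s2 = rho(W2 @ s1 + W3.T @ s3)
        pre = W3 @ s2 + Ws @ s1
        if beta > 0.0:
            pre = pre + beta * (y - s3)   # nudge: -beta * dC/ds3 for C = 0.5||y - s3||^2
        s3 = rho(pre)
    return s1, s2, s3

def ep_update(x, y, beta=0.5, lr=0.05):
    """One equilibrium-propagation step: contrast free and nudged equilibria.
    The weight gradient estimate is (1/beta) * (dPhi/dW|nudged - dPhi/dW|free),
    an outer product of the equilibrium states on each side of a connection."""
    global W1, W2, W3, Ws
    f1, f2, f3 = relax(x)                    # free phase
    g1, g2, g3 = relax(x, y, beta=beta)      # nudged phase
    W1 += lr / beta * (np.outer(g1, x) - np.outer(f1, x))
    W2 += lr / beta * (np.outer(g2, g1) - np.outer(f2, f1))
    W3 += lr / beta * (np.outer(g3, g2) - np.outer(f3, f2))
    Ws += lr / beta * (np.outer(g3, g1) - np.outer(f3, f1))

# Usage: drive one random input toward a one-hot target with purely local updates.
x, y = rng.random(nx), np.eye(ny)[1]
for _ in range(50):
    ep_update(x, y)
print(relax(x)[2].round(2))   # equilibrium output should move toward y
```

Note that each weight update depends only on the equilibrium states at the two ends of that connection; this locality is exactly the property the abstract highlights as making equilibrium propagation attractive for neuromorphic hardware.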

Country of Origin
🇮🇳 India

Repos / Data Links

Page Count
7 pages

Category
Computer Science:
Neural and Evolutionary Computing