Learning at the Speed of Physics: Equilibrium Propagation on Oscillator Ising Machines
By: Alex Gower
Potential Business Impact:
Computers learn faster by copying how nature works.
Physical systems that naturally perform energy descent offer a direct route to accelerating machine learning. Oscillator Ising Machines (OIMs) exemplify this idea: their GHz-frequency dynamics mirror both the optimization of energy-based models (EBMs) and gradient descent on loss landscapes, while intrinsic noise corresponds to Langevin dynamics, supporting sampling as well as optimization. Equilibrium Propagation (EP) unifies these processes into descent on a single total energy landscape, enabling local learning rules without global backpropagation. We show that EP on OIMs achieves competitive accuracy ($\sim 97.2 \pm 0.1 \%$ on MNIST, $\sim 88.0 \pm 0.1 \%$ on Fashion-MNIST) while maintaining robustness under realistic hardware constraints such as parameter quantization and phase noise. These results establish OIMs as a fast, energy-efficient substrate for neuromorphic learning, and suggest that EBMs, often bottlenecked by conventional processors, may find practical realization on physical hardware whose dynamics directly perform their optimization.
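To make the EP idea concrete, here is a minimal numerical sketch of the two-phase learning rule on a toy continuous Hopfield-style network. All sizes, constants, and the quadratic cost are illustrative assumptions, not the paper's OIM setup: the point is only that EP contrasts local correlations between a free relaxation and a weakly nudged relaxation of the same energy, with no backpropagation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (hypothetical; the paper trains on MNIST/Fashion-MNIST).
n_in, n_hid, n_out = 4, 8, 2
W1 = 0.5 * rng.standard_normal((n_hid, n_in))
W2 = 0.5 * rng.standard_normal((n_out, n_hid))

def rho(s):           # unit activation
    return np.tanh(s)

def drho(s):          # its derivative
    return 1.0 - np.tanh(s) ** 2

def relax(x, target, beta, steps=200, eta=0.1):
    """Gradient descent on the total energy F = E + beta*C toward a fixed
    point, with E = 0.5(|h|^2 + |y|^2) - rho(h).(W1 x) - rho(y).(W2 rho(h))
    and C = 0.5*|y - target|^2.  beta=0 is the free phase, beta>0 the
    nudged phase."""
    h = np.zeros(n_hid)
    y = np.zeros(n_out)
    for _ in range(steps):
        dh = h - drho(h) * (W1 @ x + W2.T @ rho(y))   # dE/dh
        dy = y - drho(y) * (W2 @ rho(h))              # dE/dy
        dy += beta * (y - target)                     # + beta * dC/dy
        h -= eta * dh
        y -= eta * dy
    return h, y

def ep_step(x, target, beta=0.5, lr=0.05):
    """One EP update: a local, Hebbian-style contrast between phases."""
    global W1, W2
    h0, y0 = relax(x, target, beta=0.0)    # free phase
    hb, yb = relax(x, target, beta=beta)   # weakly nudged phase
    # delta_theta = -(1/beta) * (dE/dtheta at nudged - at free fixed point)
    W1 += (lr / beta) * (np.outer(rho(hb), x) - np.outer(rho(h0), x))
    W2 += (lr / beta) * (np.outer(rho(yb), rho(hb)) - np.outer(rho(y0), rho(h0)))
    return 0.5 * np.sum((y0 - target) ** 2)  # free-phase cost

x = np.array([1.0, -1.0, 0.5, 0.0])
target = np.array([0.8, -0.3])
losses = [ep_step(x, target) for _ in range(30)]
print(f"free-phase cost: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Each weight update touches only quantities available at the two ends of that synapse, which is why EP maps onto hardware whose physics does the relaxation: an OIM would replace the `relax` loop with its own GHz-speed phase dynamics.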
Similar Papers
Equilibrium Propagation Without Limits
Machine Learning (CS)
Makes AI learn faster with bigger, bolder steps.
Learning Dynamics in Memristor-Based Equilibrium Propagation
Machine Learning (CS)
Makes computers learn faster and use less power.
Training and synchronizing oscillator networks with Equilibrium Propagation
Disordered Systems and Neural Networks
Trains computer "brains" to recognize pictures.