ADMM-Based Training for Spiking Neural Networks
By: Giovanni Perin, Cesare Bidini, Riccardo Mazzieri, and more
Potential Business Impact:
Teaches brain-like computers to learn faster.
In recent years, spiking neural networks (SNNs) have gained momentum due to their high potential for time-series processing combined with minimal energy consumption. However, they still lack a dedicated and efficient training algorithm. The popular backpropagation with surrogate gradients, adapted from stochastic gradient descent (SGD)-derived algorithms, has several drawbacks when used as an optimizer for SNNs; in particular, it suffers from low scalability and numerical imprecision. In this paper, we propose a novel SNN training method based on the alternating direction method of multipliers (ADMM). Our ADMM-based training addresses the non-differentiability of the SNN step function. We formulate the problem, derive closed-form updates, and, in a simulated proof of concept, empirically demonstrate the optimizer's convergence behavior, its potential, and possible research directions for improving the method.
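To make the splitting idea concrete, here is a minimal toy sketch of ADMM-style training for a single spiking layer. This is NOT the paper's formulation: the specific splitting, the squared loss, and the hyperparameters (`rho`, `eps`) are illustrative assumptions. The key point it shows is the one the abstract describes: by introducing an auxiliary pre-activation variable `a` with the constraint `a = X @ W`, the non-differentiable step function is confined to an elementwise subproblem with a closed-form update, so no surrogate gradient is needed.

```python
import numpy as np

def step(a):
    """Non-differentiable spiking activation: 1 if a > 0 else 0."""
    return (a > 0).astype(float)

def admm_train(X, y, rho=1.0, eps=1e-2, iters=50):
    """Toy ADMM loop (illustrative, not the paper's method):
    min_W sum_i (step(a_i) - y_i)^2  subject to  a = X @ W."""
    n, d = X.shape
    a = np.where(y > 0, 1.0, -1.0)   # init pre-activations to encode targets
    u = np.zeros(n)                  # scaled dual variable for a = X @ W
    for _ in range(iters):
        # W-update: least squares, argmin_W ||X @ W - (a + u)||^2
        W, *_ = np.linalg.lstsq(X, a + u, rcond=None)
        v = X @ W - u
        # a-update: step() is piecewise constant, so the scalar subproblem
        # reduces to comparing the best candidate on each side of zero.
        a_neg = np.minimum(v, -eps)  # forces spike output 0
        a_pos = np.maximum(v, eps)   # forces spike output 1
        cost_neg = (0.0 - y) ** 2 + 0.5 * rho * (a_neg - v) ** 2
        cost_pos = (1.0 - y) ** 2 + 0.5 * rho * (a_pos - v) ** 2
        a = np.where(cost_neg <= cost_pos, a_neg, a_pos)
        # dual update: accumulate the constraint residual a - X @ W
        u = u + a - X @ W
    return W

# 1-D threshold task with a bias column: y = 1 when x exceeds ~0.2
x = np.linspace(-1.0, 1.0, 20)
X = np.column_stack([x, np.ones_like(x)])
y = (x > 0.2).astype(float)
W = admm_train(X, y)
acc = (step(X @ W) == y).mean()
```

Note the design choice in the `a`-update: because the step function takes only two values, the elementwise minimization is solved exactly by evaluating two candidates, which is what makes a closed-form update possible despite the activation having no useful gradient.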
Similar Papers
Adaptive Gradient Learning for Spiking Neural Networks by Exploiting Membrane Potential Dynamics
Neural and Evolutionary Computing
Makes brain-like computers learn faster and better.
Random Feature Spiking Neural Networks
Machine Learning (CS)
Makes brain-like computers learn faster, using less power.
DS-ATGO: Dual-Stage Synergistic Learning via Forward Adaptive Threshold and Backward Gradient Optimization for Spiking Neural Networks
Neural and Evolutionary Computing
Makes brain-like computers learn better and use less power.