Full Integer Arithmetic Online Training for Spiking Neural Networks
By: Ismael Gomez, Guangzhi Tang
Potential Business Impact:
Trains brain-like computers faster and with less power.
Spiking Neural Networks (SNNs) are promising for neuromorphic computing due to their biological plausibility and energy efficiency. However, training methods such as Backpropagation Through Time (BPTT) and Real-Time Recurrent Learning (RTRL) remain computationally intensive. This work introduces an integer-only online training algorithm that uses a mixed-precision approach to improve efficiency and reduce memory usage by over 60%. The method replaces floating-point operations with integer arithmetic, enabling a hardware-friendly implementation, and it generalizes to Convolutional and Recurrent SNNs (CSNNs, RSNNs), demonstrating versatility across architectures. Evaluations on MNIST and the Spiking Heidelberg Digits (SHD) dataset show that mixed-precision models using 16-bit shadow weights and 8- or 12-bit inference weights achieve accuracy comparable to or better than full-precision baselines. Despite some accuracy loss at lower precisions and in deeper models, performance remains robust. In conclusion, the proposed integer-only online learning algorithm offers an effective way to train SNNs efficiently, enabling deployment on resource-constrained neuromorphic hardware without sacrificing accuracy.
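To make the shadow/inference weight split concrete, here is a minimal Python sketch. It is illustrative only: the bit widths mirror the 16-bit shadow and 8-bit inference configuration described above, but the shift-based learning rate (lr_shift) and the specific quantization rule are assumptions for demonstration, not the authors' exact update algorithm.

```python
import numpy as np

SHADOW_BITS = 16                   # shadow (accumulator) weights, per the paper
INFER_BITS = 8                     # inference weights used in the forward pass
SHIFT = SHADOW_BITS - INFER_BITS   # assumed power-of-two rescaling between the two

rng = np.random.default_rng(0)

# Shadow weights live in int16 and absorb small integer gradient steps.
shadow_w = rng.integers(-2**12, 2**12, size=(4, 4), dtype=np.int16)

def integer_update(shadow_w, int_grad, lr_shift=4):
    """Apply an integer gradient step; the right shift plays the role
    of a power-of-two learning rate (an assumption of this sketch)."""
    step = int_grad >> lr_shift
    updated = shadow_w.astype(np.int32) - step.astype(np.int32)
    # Saturate back into the int16 shadow range.
    return np.clip(updated, -2**15, 2**15 - 1).astype(np.int16)

def inference_weights(shadow_w):
    """Quantize shadow weights to int8 for the forward pass by
    dropping the low-order bits."""
    return (shadow_w >> SHIFT).astype(np.int8)

# One illustrative step with a synthetic integer "gradient".
int_grad = rng.integers(-2**10, 2**10, size=(4, 4), dtype=np.int16)
shadow_w = integer_update(shadow_w, int_grad)
w_int8 = inference_weights(shadow_w)
print(w_int8)
```

The design point the sketch tries to capture is that every operation (update, rescaling, saturation) stays in integer arithmetic, with power-of-two shifts standing in for multiplications and divisions; this is the property that makes such training hardware-friendly for neuromorphic deployment.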
Similar Papers
A Scalable Hybrid Training Approach for Recurrent Spiking Neural Networks
Neural and Evolutionary Computing
Teaches computer brains to learn faster, using less memory.
A Self-Ensemble Inspired Approach for Effective Training of Binary-Weight Spiking Neural Networks
Neural and Evolutionary Computing
Makes AI learn faster with less power.
Multiplication-Free Parallelizable Spiking Neurons with Efficient Spatio-Temporal Dynamics
Neural and Evolutionary Computing
Makes brain-like computers learn faster, cheaper.