Plug-and-Play Homeostatic Spark: Zero-Cost Acceleration for SNN Training Across Paradigms
By: Rui Chen, Xingyu Chen, Yaoqing Hu, and more
Potential Business Impact:
Makes brain-like computers learn much faster.
Spiking neural networks offer event-driven computation, sparse activation, and hardware efficiency, yet their training often converges slowly and lacks stability. We present Adaptive Homeostatic Spiking Activity Regulation (AHSAR), an extremely simple plug-in, training-paradigm-agnostic method that stabilizes optimization and accelerates convergence without changing the model architecture, loss, or gradients. AHSAR introduces no trainable parameters. It maintains a per-layer homeostatic state during the forward pass, maps centered firing-rate deviations to threshold scales through a bounded nonlinearity, uses lightweight cross-layer diffusion to avoid sharp imbalances, and applies a slow, across-epoch global gain that combines validation progress with activity energy to tune the operating point. The computational cost is negligible. Across diverse training methods, SNN architectures of different depths, widths, and temporal steps, and both RGB and DVS datasets, AHSAR consistently improves strong baselines and enhances out-of-distribution robustness. These results indicate that keeping layer activity within a moderate band is a simple and effective principle for scalable and efficient SNN training.
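To make the mechanism concrete, here is a minimal PyTorch sketch of a homeostatic threshold regulator following the steps the abstract names: a per-layer state updated in the forward pass, a bounded nonlinearity (tanh here) mapping centered rate deviations to threshold scales, neighbor-averaging as cross-layer diffusion, and a slow per-epoch global gain. All class, method, and parameter names (AHSARSketch, target_rate, sensitivity, diffusion, and the specific update rules and constants) are illustrative assumptions, not the paper's exact formulation.

```python
import torch

class AHSARSketch:
    """Hypothetical sketch of an AHSAR-style regulator (not the paper's code).

    Keeps a per-layer homeostatic state (an EMA of firing rates), maps
    centered deviations from a target rate to threshold scales through a
    bounded nonlinearity, diffuses scales across neighboring layers, and
    applies a slow global gain updated once per epoch. No trainable
    parameters; all updates run under no_grad at negligible cost.
    """

    def __init__(self, num_layers, target_rate=0.1, ema_momentum=0.9,
                 sensitivity=0.5, diffusion=0.25):
        self.target_rate = target_rate     # assumed "moderate band" center
        self.ema_momentum = ema_momentum   # smoothing for homeostatic state
        self.sensitivity = sensitivity     # strength of threshold response
        self.diffusion = diffusion         # cross-layer smoothing weight
        self.global_gain = 1.0             # slow, across-epoch scalar
        self.rate_ema = torch.full((num_layers,), target_rate)

    @torch.no_grad()
    def update(self, spikes_per_layer):
        """One forward-pass update; returns per-layer threshold scales.

        spikes_per_layer: list of binary spike tensors, one per layer.
        """
        rates = torch.stack([s.float().mean() for s in spikes_per_layer])
        # Per-layer homeostatic state: EMA of observed firing rates.
        self.rate_ema = (self.ema_momentum * self.rate_ema
                         + (1 - self.ema_momentum) * rates)
        # Centered deviation from the target rate, squashed by a bounded
        # nonlinearity; over-active layers get scales > 1 (higher threshold).
        deviation = (self.rate_ema - self.target_rate) / self.target_rate
        scale = 1.0 + self.sensitivity * torch.tanh(deviation)
        # Lightweight cross-layer diffusion: blend each layer's scale with
        # the mean of its neighbors to avoid sharp imbalance across depth.
        padded = torch.cat([scale[:1], scale, scale[-1:]])
        neighbor_mean = 0.5 * (padded[:-2] + padded[2:])
        scale = (1 - self.diffusion) * scale + self.diffusion * neighbor_mean
        return self.global_gain * scale

    @torch.no_grad()
    def epoch_step(self, val_improved, mean_rate):
        """Slow across-epoch gain: combine validation progress with an
        activity-energy proxy (mean firing rate). Illustrative rule only."""
        energy_high = mean_rate > self.target_rate
        step = 0.02 if (energy_high and not val_improved) else -0.02
        self.global_gain = float(min(1.5, max(0.5, self.global_gain + step)))
```

In use, the returned scales would multiply each layer's firing threshold on the next forward pass, so over-active layers are damped and under-active layers are released, without touching weights, loss, or gradients.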
Similar Papers
DS-ATGO: Dual-Stage Synergistic Learning via Forward Adaptive Threshold and Backward Gradient Optimization for Spiking Neural Networks
Neural and Evolutionary Computing
Makes brain-like computers learn better and use less power.
Spatiotemporal Radar Gesture Recognition with Hybrid Spiking Neural Networks: Balancing Accuracy and Efficiency
Neural and Evolutionary Computing
Saves energy for radar that sees people.
Cannistraci-Hebb Training on Ultra-Sparse Spiking Neural Networks
Neural and Evolutionary Computing
Makes computer brains use less energy.