Plug-and-Play Homeostatic Spark: Zero-Cost Acceleration for SNN Training Across Paradigms

Published: December 4, 2025 | arXiv ID: 2512.05015v1

By: Rui Chen, Xingyu Chen, Yaoqing Hu, and more

Potential Business Impact:

Makes brain-inspired (spiking) neural networks train much faster at essentially no extra cost.

Business Areas:
Autonomous Vehicles, Transportation

Spiking neural networks offer event-driven computation, sparse activation, and hardware efficiency, yet training often converges slowly and lacks stability. We present Adaptive Homeostatic Spiking Activity Regulation (AHSAR), an extremely simple, plug-in, training-paradigm-agnostic method that stabilizes optimization and accelerates convergence without changing the model architecture, loss, or gradients. AHSAR introduces no trainable parameters. It maintains a per-layer homeostatic state during the forward pass, maps centered firing-rate deviations to threshold scales through a bounded nonlinearity, uses lightweight cross-layer diffusion to avoid sharp imbalance, and applies a slow across-epoch global gain that combines validation progress with activity energy to tune the operating point. The computational cost is negligible. Across diverse training methods, SNN architectures of different depths, widths, and temporal steps, and both RGB and DVS datasets, AHSAR consistently improves strong baselines and enhances out-of-distribution robustness. These results indicate that keeping layer activity within a moderate band is a simple and effective principle for scalable and efficient SNN training.
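To make the mechanism concrete, here is a minimal Python sketch of a regulator with the four ingredients the abstract names: a per-layer homeostatic state, a bounded mapping from centered firing-rate deviations to threshold scales, cross-layer diffusion, and a slow across-epoch global gain. This is an illustration assembled from the abstract alone, not the authors' implementation; all names and hyperparameters (`target_rate`, `ema_beta`, `diffusion`, `scale_amp`, the use of `tanh`, the gain step sizes) are assumptions.

```python
import numpy as np

class AHSARSketch:
    """Hypothetical sketch of Adaptive Homeostatic Spiking Activity
    Regulation (AHSAR), reconstructed from the paper's abstract.
    It holds no trainable parameters, only running statistics."""

    def __init__(self, num_layers, target_rate=0.1, ema_beta=0.9,
                 diffusion=0.25, scale_amp=0.5):
        # Per-layer homeostatic state: a slow estimate of each layer's firing rate.
        self.rate_state = np.full(num_layers, target_rate)
        self.target_rate = target_rate  # assumed desired operating band center
        self.ema_beta = ema_beta        # assumed smoothing factor
        self.diffusion = diffusion      # assumed cross-layer mixing weight
        self.scale_amp = scale_amp      # assumed amplitude of threshold scaling
        self.global_gain = 1.0          # slow across-epoch gain

    def update_forward(self, firing_rates):
        """Call once per forward pass with observed per-layer firing rates;
        returns a multiplicative threshold scale for each layer."""
        rates = np.asarray(firing_rates, dtype=float)
        # 1. Update the per-layer homeostatic state (EMA of firing rates).
        self.rate_state = (self.ema_beta * self.rate_state
                           + (1 - self.ema_beta) * rates)
        # 2. Center deviations around the homeostatic target.
        dev = self.rate_state - self.target_rate
        # 3. Lightweight cross-layer diffusion: blend each layer's deviation
        #    with its neighbors' mean to avoid sharp imbalance. (np.roll wraps
        #    at the ends; a real implementation might clamp instead.)
        neighbor_mean = 0.5 * (np.roll(dev, 1) + np.roll(dev, -1))
        dev = (1 - self.diffusion) * dev + self.diffusion * neighbor_mean
        # 4. Bounded nonlinearity (tanh, one plausible choice): layers firing
        #    above target get higher thresholds, quiet layers get lower ones.
        scales = 1.0 + self.scale_amp * np.tanh(dev / self.target_rate)
        return self.global_gain * scales

    def update_epoch(self, val_improved, activity_energy):
        """Slow across-epoch gain update combining validation progress with
        an activity-energy signal (e.g. mean squared firing rate)."""
        step = 0.02 if val_improved else -0.02
        # Push the gain down when activity runs hot, up when it runs cold.
        step += 0.02 * np.sign(self.target_rate ** 2 - activity_energy)
        self.global_gain = float(np.clip(self.global_gain + step, 0.5, 2.0))

# Usage sketch: scale each layer's spiking threshold by the returned factors.
reg = AHSARSketch(num_layers=4)
scales = reg.update_forward([0.05, 0.12, 0.20, 0.08])
```

Because the regulator only adjusts firing thresholds from forward-pass statistics, it leaves the loss and gradients untouched and adds a handful of vector operations per step, consistent with the paper's claim of negligible computational cost.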

Country of Origin
🇨🇳 China

Page Count
12 pages

Category
Computer Science: Neural and Evolutionary Computing