Efficient ANN-Guided Distillation: Aligning Rate-based Features of Spiking Neural Networks through Hybrid Block-wise Replacement
By: Shu Yang, Chengting Yu, Lei Liu, and more
Potential Business Impact:
Teaches computers to learn faster, like brains.
Spiking Neural Networks (SNNs) have garnered considerable attention as a potential alternative to Artificial Neural Networks (ANNs), and recent studies have highlighted their potential on large-scale datasets. Two main approaches exist for SNN training: direct training and ANN-to-SNN (ANN2SNN) conversion. To fully leverage existing ANN models for guiding SNN learning, one can employ either direct ANN-to-SNN conversion or ANN-SNN distillation training. In this paper, we propose an ANN-SNN distillation framework from the ANN-to-SNN perspective, built around a block-wise replacement strategy for ANN-guided learning. By generating intermediate hybrid models that progressively align SNN feature spaces to those of the ANN through rate-based features, our framework naturally incorporates rate-based backpropagation as a training method. Our approach achieves results comparable to or better than state-of-the-art SNN distillation methods while demonstrating efficiency in both training and learning.
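To make the block-wise replacement idea concrete, below is a minimal PyTorch sketch (not the authors' released code) of one hybrid training step: the first k ANN blocks are swapped for spiking blocks, the spiking prefix's firing rate (its rate-based feature, averaged over timesteps) is aligned to the frozen ANN's activation at the same depth via an MSE loss, and the remaining ANN blocks complete the forward pass. All names (SpikeFn, SNNBlock, hybrid_forward), the constant-current input encoding, the surrogate gradient, and every hyperparameter are illustrative assumptions, not the paper's API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

T = 4  # SNN simulation timesteps (assumed)


class SpikeFn(torch.autograd.Function):
    """Heaviside spike with a rectangular surrogate gradient."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v >= 0.0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        return grad_out * (v.abs() < 0.5).float()


class SNNBlock(nn.Module):
    """Linear layer + LIF neurons; maps (T, B, d_in) spike trains to (T, B, d_out)."""

    def __init__(self, d_in, d_out, tau=2.0, v_th=1.0):
        super().__init__()
        self.fc = nn.Linear(d_in, d_out)
        self.tau, self.v_th = tau, v_th

    def forward(self, x):
        v = torch.zeros(x.shape[1], self.fc.out_features, device=x.device)
        out = []
        for t in range(T):
            v = v + (self.fc(x[t]) - v) / self.tau  # leaky integration
            s = SpikeFn.apply(v - self.v_th)        # fire
            v = v - s * self.v_th                   # soft reset
            out.append(s)
        return torch.stack(out)


def ann_block(d_in, d_out):
    return nn.Sequential(nn.Linear(d_in, d_out), nn.ReLU())


dims = [784, 256, 256, 10]
ann = nn.ModuleList(ann_block(a, b) for a, b in zip(dims, dims[1:]))
snn = nn.ModuleList(SNNBlock(a, b) for a, b in zip(dims, dims[1:]))
for p in ann.parameters():              # the ANN acts as a frozen teacher
    p.requires_grad_(False)


def hybrid_forward(x, k):
    """Hybrid model: first k blocks are SNN, the rest come from the ANN.

    The SNN prefix's firing rate (mean over timesteps) is the rate-based
    feature that (a) feeds the remaining ANN blocks and (b) is pulled toward
    the frozen ANN's activation at the same depth.
    """
    h = x.unsqueeze(0).repeat(T, 1, 1)  # constant-current input encoding
    for blk in snn[:k]:
        h = blk(h)
    rate = h.mean(dim=0)                # rate-based feature of the SNN prefix

    a = x                               # matching feature from the ANN teacher
    for blk in ann[:k]:
        a = blk(a)

    align = F.mse_loss(rate, a)         # block-wise feature alignment loss
    out = rate
    for blk in ann[k:]:                 # ANN suffix completes the forward pass
        out = blk(out)
    return out, align


# Toy training step; in practice k would be scheduled from 1 up to len(dims) - 1
# so hybrids progress from mostly-ANN toward a fully spiking network.
x, y = torch.rand(8, 784), torch.randint(0, 10, (8,))
logits, align = hybrid_forward(x, k=2)
loss = F.cross_entropy(logits, y) + align
loss.backward()                         # gradients reach only the SNN prefix
```

One caveat on the sketch: for brevity it backpropagates through the unrolled time loop with a surrogate gradient, whereas the rate-based backpropagation the abstract refers to would compute gradients directly on the rate-based features rather than unrolling every timestep.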
Similar Papers
Bridge the Gap between SNN and ANN for Image Restoration
CV and Pattern Recognition
Makes AI picture cleaners use much less power.
Hybrid Layer-Wise ANN-SNN With Surrogate Spike Encoding-Decoding Structure
Neural and Evolutionary Computing
Makes smart computers use less power.
Efficient Logit-based Knowledge Distillation of Deep Spiking Neural Networks for Full-Range Timestep Deployment
Machine Learning (CS)
Lets brain-like computers learn better and change easily.