Score: 2

Cross Knowledge Distillation between Artificial and Spiking Neural Networks

Published: July 12, 2025 | arXiv ID: 2507.09269v1

By: Shuhan Ye, Yuanbin Qian, Chong Wang, and more

Potential Business Impact:

Lets energy-efficient spiking neural networks learn from RGB images and well-trained standard networks, improving the accuracy of event-camera vision systems.

Business Areas:
Image Recognition, Data and Analytics, Software

Recently, Spiking Neural Networks (SNNs) have demonstrated rich potential in the computer vision domain due to their high biological plausibility, event-driven characteristics, and energy efficiency. Still, limited annotated event-based datasets and immature SNN architectures leave their performance inferior to that of Artificial Neural Networks (ANNs). To enhance the performance of SNNs on their optimal data format, DVS data, we explore using RGB data and well-performing ANNs to implement knowledge distillation. Doing so requires solving both cross-modality and cross-architecture challenges. In this paper, we propose cross knowledge distillation (CKD), which not only leverages semantic similarity and sliding replacement to mitigate the cross-modality challenge, but also uses an indirect phased knowledge distillation to mitigate the cross-architecture challenge. We validated our method on mainstream neuromorphic datasets, including N-Caltech101 and CEP-DVS. The experimental results show that our method outperforms current state-of-the-art methods. The code will be available at https://github.com/ShawnYE618/CKD
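To make the teacher-student setup concrete, below is a minimal sketch of logit-based knowledge distillation from an RGB-trained ANN teacher to a student fed event (DVS) data. This is not the paper's CKD method (it omits semantic similarity, sliding replacement, and the indirect phased distillation); all module names, input shapes, and hyperparameters (temperature T, mixing weight alpha) are hypothetical placeholders used only to illustrate the basic loss such a cross-modality pipeline builds on.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Combine a soft-target KL term (temperature T) with a hard-label CE term."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard


if __name__ == "__main__":
    num_classes = 101  # e.g., N-Caltech101

    # Hypothetical stand-ins: a frozen ANN teacher fed RGB frames and a
    # student fed 2-channel event frames; real models would be CNN/SNN backbones.
    teacher = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, num_classes)).eval()
    student = nn.Sequential(nn.Flatten(), nn.Linear(2 * 32 * 32, num_classes))

    rgb = torch.randn(8, 3, 32, 32)      # paired RGB input for the teacher
    events = torch.randn(8, 2, 32, 32)   # event-frame input for the student
    labels = torch.randint(0, num_classes, (8,))

    with torch.no_grad():
        t_logits = teacher(rgb)          # teacher provides soft targets only
    s_logits = student(events)

    loss = distillation_loss(s_logits, t_logits, labels)
    loss.backward()
    print(f"distillation loss: {loss.item():.4f}")
```

In the paper's setting, the student would be an SNN processing event streams and the distillation would be phased and mediated rather than direct; this sketch only shows the baseline teacher/student loss shape.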

Repos / Data Links
https://github.com/ShawnYE618/CKD

Page Count
6 pages

Category
Computer Science: CV and Pattern Recognition