General Self-Prediction Enhancement for Spiking Neurons
By: Zihan Huang, Zijie Xu, Yihan Huang, and more
Potential Business Impact:
Brain-like computers learn better and faster.
Spiking Neural Networks (SNNs) are highly energy-efficient thanks to event-driven, sparse computation, but training them is hampered by the non-differentiability of spikes and by trade-offs among performance, efficiency, and biological plausibility. Crucially, mainstream SNNs ignore predictive coding, a core cortical mechanism in which the brain predicts its inputs and encodes the prediction errors for efficient perception. Inspired by this, we propose a self-prediction enhanced spiking neuron that generates an internal prediction current from its input-output history and uses it to modulate the membrane potential. This design offers dual advantages: it creates a continuous gradient path that alleviates vanishing gradients and improves training stability and accuracy, and it aligns with biological principles, resembling distal dendritic modulation and error-driven synaptic plasticity. Experiments show consistent performance gains across diverse architectures, neuron types, time steps, and tasks, demonstrating broad applicability for enhancing SNNs.
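As a rough illustration of the idea (not the authors' exact formulation), the sketch below adds a self-prediction current to a standard leaky integrate-and-fire (LIF) neuron: the neuron linearly predicts its incoming current from its previous input and output spike, adds a scaled version of that prediction to the membrane potential, and adapts the predictor with a simple error-driven rule. All parameter names, the history window of one step, and the update rule are illustrative assumptions.

```python
import numpy as np

def lif_with_self_prediction(inputs, tau=2.0, v_th=1.0, alpha=0.3, eta=0.01):
    """Leaky integrate-and-fire neuron with an illustrative self-prediction
    current. Each step, the neuron predicts the incoming current from its
    previous input and spike (a hypothesized stand-in for the paper's
    input-output history), adds alpha * prediction to the membrane potential,
    and updates the prediction weights by an error-driven rule."""
    v = 0.0                       # membrane potential
    w = np.zeros(2)               # prediction weights over [prev_input, prev_spike]
    hist = np.zeros(2)            # previous input and previous output spike
    spikes = []
    for x in inputs:
        pred = float(w @ hist)            # internal prediction of the input current
        w += eta * (x - pred) * hist      # error-driven update of the predictor
        v = v / tau + x + alpha * pred    # leaky integration plus prediction current
        s = 1.0 if v >= v_th else 0.0     # hard-threshold firing
        if s:
            v = 0.0                       # reset after a spike
        hist = np.array([x, s])
        spikes.append(s)
    return spikes

spikes = lif_with_self_prediction([0.6, 0.6, 0.6, 0.2, 0.9, 0.9])
print(spikes)
```

In a trained SNN the hard threshold would be paired with a surrogate gradient; the point here is only that the prediction current adds a continuous, spike-independent path into the membrane potential.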
Similar Papers
Spiking Neural Networks: The Future of Brain-Inspired Computing
Neural and Evolutionary Computing
Makes computers use less power to think.
On the Universal Representation Property of Spiking Neural Networks
Neural and Evolutionary Computing
Makes brain-like computers learn faster and use less power.
ChronoPlastic Spiking Neural Networks
Neural and Evolutionary Computing
Helps computers remember long-ago events better.