Supervised Spike Agreement Dependent Plasticity for Fast Local Learning in Spiking Neural Networks
By: Gouri Lakshmi S, Athira Chandrasekharan, Harshit Kumar and more
Spike-Timing-Dependent Plasticity (STDP) provides a biologically grounded learning rule for spiking neural networks (SNNs), but its reliance on precise spike timing and pairwise updates limits the speed of weight learning. We introduce a supervised extension of Spike Agreement-Dependent Plasticity (SADP), which replaces pairwise spike-timing comparisons with population-level agreement metrics such as Cohen's kappa. The proposed learning rule preserves strict synaptic locality, admits linear-time complexity, and enables efficient supervised learning without backpropagation, surrogate gradients, or teacher forcing. We integrate supervised SADP within hybrid CNN-SNN architectures, where convolutional encoders provide compact feature representations that are converted into Poisson spike trains for agreement-driven learning in the SNN. Extensive experiments on MNIST, Fashion-MNIST, CIFAR-10, and biomedical image classification tasks demonstrate competitive performance and fast convergence. Additional analyses show stable performance across broad hyperparameter ranges and compatibility with device-inspired synaptic update dynamics. Together, these results establish supervised SADP as a scalable, biologically grounded, and hardware-aligned learning paradigm for spiking neural networks.
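The two ingredients named in the abstract — a population-level agreement score (Cohen's kappa over binary spike trains) and Poisson encoding of CNN features — can be sketched in a few lines. This is an illustrative reading of the idea, not the paper's implementation: the function names (`cohens_kappa`, `poisson_encode`, `sadp_update`), the exact form of the weight update, and all parameters here are assumptions made for the example.

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa between two binary spike trains (1 = spike, 0 = silent).

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed per-step
    agreement and p_e the agreement expected by chance from the firing rates.
    """
    a = np.asarray(a, dtype=int)
    b = np.asarray(b, dtype=int)
    p_o = np.mean(a == b)                        # observed agreement
    ra, rb = a.mean(), b.mean()                  # firing rates
    p_e = ra * rb + (1 - ra) * (1 - rb)          # chance agreement
    if p_e == 1.0:                               # degenerate: both trains constant
        return 0.0
    return (p_o - p_e) / (1.0 - p_e)

def poisson_encode(features, T=100, rng=None):
    """Rate-code features in [0, 1] as T-step Bernoulli ('Poisson') spike trains."""
    rng = rng or np.random.default_rng(0)
    features = np.clip(np.asarray(features, float), 0.0, 1.0)
    return (rng.random((T, features.size)) < features).astype(int)

def sadp_update(w, pre, post, target, lr=0.01):
    """Hypothetical agreement-driven local update: strengthen the synapse when
    the post-synaptic train agrees with the supervised target train, weaken it
    when they disagree. Uses only per-synapse quantities (no backpropagated
    error), and each kappa evaluation is linear in the number of time steps."""
    kappa = cohens_kappa(post, target)           # in [-1, 1]
    return w + lr * kappa * np.mean(pre)         # gated by presynaptic activity
```

Identical trains give kappa = 1 (maximal potentiation), fully anti-correlated trains give kappa = -1 (depression), and chance-level agreement gives kappa near 0, so uninformative synapses drift little — one way to read the claimed stability across hyperparameter ranges.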