Sleep-Based Homeostatic Regularization for Stabilizing Spike-Timing-Dependent Plasticity in Recurrent Spiking Neural Networks
By: Andreas Massey, Aliaksandr Hubin, Stefano Nichele, and more
Spike-timing-dependent plasticity (STDP) provides a biologically plausible learning mechanism for spiking neural networks (SNNs); however, Hebbian weight updates in architectures with recurrent connections suffer from pathological weight dynamics: unbounded growth, catastrophic forgetting, and loss of representational diversity. We propose a neuromorphic regularization scheme inspired by the synaptic homeostasis hypothesis: periodic offline phases during which external inputs are suppressed, synaptic weights undergo stochastic decay toward a homeostatic baseline, and spontaneous activity enables memory consolidation. We demonstrate that this sleep-wake cycle prevents weight saturation while preserving learned structure. Empirically, we find that low to intermediate sleep durations (10-20% of training) improve stability on MNIST-like benchmarks in our STDP-SNN model, without any data-specific hyperparameter tuning. In contrast, the same sleep intervention yields no measurable benefit for the surrogate-gradient spiking neural network (SG-SNN). Taken together, these results suggest that periodic, sleep-based renormalization may represent a fundamental mechanism for stabilizing local Hebbian learning in neuromorphic systems, while also indicating that special care is required when integrating such protocols with existing gradient-based optimization methods.
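As a rough illustration of the wake-sleep protocol described in the abstract, the sketch below interleaves additive pair-based STDP updates with a sleep phase that stochastically decays weights toward a homeostatic baseline, with sleep occupying roughly 15% of training steps. This is a minimal sketch, not the authors' implementation: the network sizes, trace time constant, baseline, decay rate, noise level, and spike statistics are all assumed values chosen only to make the example self-contained and runnable.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pre, n_post = 20, 10

# Hypothetical constants -- illustrative, not taken from the paper.
tau = 20.0                       # STDP trace time constant (ms)
a_plus, a_minus = 0.01, 0.012    # potentiation / depression amplitudes
w_max = 1.0                      # hard upper bound on weights
w_baseline = 0.2                 # homeostatic set point for sleep decay
decay_rate = 0.05                # per-step pull toward the baseline during sleep
noise_std = 0.005                # spontaneous-activity jitter during sleep

def wake_step(w, pre_trace, post_trace, pre_rate=0.05, post_rate=0.05, dt=1.0):
    """One wake step: Poisson spikes drive additive pair-based STDP."""
    pre_spk = (rng.random(n_pre) < pre_rate).astype(float)
    post_spk = (rng.random(n_post) < post_rate).astype(float)
    # Exponentially decaying eligibility traces of recent spikes.
    pre_trace = pre_trace * np.exp(-dt / tau) + pre_spk
    post_trace = post_trace * np.exp(-dt / tau) + post_spk
    # Potentiate when post fires after pre; depress when pre fires after post.
    w = w + a_plus * np.outer(pre_trace, post_spk) \
          - a_minus * np.outer(pre_spk, post_trace)
    return np.clip(w, 0.0, w_max), pre_trace, post_trace

def sleep_step(w):
    """One sleep step: stochastic decay of every weight toward the baseline."""
    drift = decay_rate * (w_baseline - w)              # pull toward set point
    noise = noise_std * rng.standard_normal(w.shape)   # spontaneous fluctuations
    return np.clip(w + drift + noise, 0.0, w_max)

# Interleave wake and sleep so that sleep takes ~15% of training steps.
w = rng.uniform(0.1, 0.3, size=(n_pre, n_post))
pre_trace, post_trace = np.zeros(n_pre), np.zeros(n_post)
for cycle in range(100):
    for _ in range(85):
        w, pre_trace, post_trace = wake_step(w, pre_trace, post_trace)
    for _ in range(15):
        w = sleep_step(w)

print("mean weight after training:", round(float(w.mean()), 3))
```

The key design point the sketch tries to capture is that the sleep phase is purely local: each weight relaxes toward the same baseline with additive noise, so it bounds weight growth without requiring any gradient information, which is consistent with the abstract's claim that the benefit appears for the STDP-SNN but not for the surrogate-gradient model.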