Extending Spike-Timing Dependent Plasticity to Learning Synaptic Delays
By: Marissa Dominijanni, Alexander Ororbia, Kenneth W. Regan
Potential Business Impact:
Teaches computer brains to learn faster.
Synaptic delays play a crucial role in biological neuronal networks, where their modulation has been observed in mammalian learning processes. In the realm of neuromorphic computing, although spiking neural networks (SNNs) aim to emulate biology more closely than traditional artificial neural networks do, synaptic delays are rarely incorporated into their simulation. We introduce a novel learning rule for simultaneously learning synaptic connection strengths and delays by extending spike-timing dependent plasticity (STDP), a Hebbian method commonly used for learning synaptic weights. We validate our approach by extending a widely used SNN model for classification trained with unsupervised learning. We then demonstrate the effectiveness of our new method by comparing it against other existing methods for co-learning synaptic weights and delays, as well as against STDP without synaptic delays. Results demonstrate that our proposed method consistently achieves superior performance across a variety of test scenarios. Furthermore, our experimental results yield insight into the interplay between synaptic efficacy and delay.
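To make the idea concrete, a pair-based STDP rule of this kind can update both the weight and the delay from the same spike-timing difference, measured against the delayed arrival of the presynaptic spike rather than its raw emission time. The Python sketch below illustrates this; the exponential trace shapes, the constants (A_PLUS, B_PLUS, TAU, the delay bounds), and the sign convention for the delay update are all assumptions for illustration, not the authors' exact formulation.

```python
import numpy as np

# Minimal sketch of pair-based STDP extended to per-synapse delays.
# All names and constants below are illustrative assumptions, not the
# paper's actual rule or hyperparameters.
A_PLUS, A_MINUS = 0.010, 0.012   # weight learning rates (LTP / LTD)
B_PLUS, B_MINUS = 0.10, 0.10     # delay learning rates (assumed)
TAU = 20.0                       # plasticity time constant (ms)
D_MIN, D_MAX = 0.0, 25.0         # allowed delay range (ms, assumed)

def stdp_weight_delay_update(w, d, t_pre, t_post):
    """Update weight w and delay d (ms) from one pre/post spike pair.

    The presynaptic spike takes effect at t_pre + d, so the timing
    difference driving plasticity is measured against the delayed
    arrival, coupling weight learning and delay learning.
    """
    dt = t_post - (t_pre + d)            # post minus delayed pre arrival
    if dt >= 0:
        # Causal pairing: potentiate the weight and lengthen the delay
        # so the spike arrives closer to the postsynaptic firing time.
        w += A_PLUS * np.exp(-dt / TAU)
        d += B_PLUS * np.exp(-dt / TAU)
    else:
        # Anti-causal pairing: depress the weight, shorten the delay.
        w -= A_MINUS * np.exp(dt / TAU)
        d -= B_MINUS * np.exp(dt / TAU)
    return w, float(np.clip(d, D_MIN, D_MAX))

# Example: pre spike at 5 ms with a 3 ms delay, post spike at 10 ms.
print(stdp_weight_delay_update(0.5, 3.0, 5.0, 10.0))
```

Under this convention, delays drift toward values that make presynaptic arrivals coincide with postsynaptic firing, while the weight update follows the familiar STDP window applied to the delay-shifted timing.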
Similar Papers
Spatial Spiking Neural Networks Enable Efficient and Robust Temporal Computation
Neural and Evolutionary Computing
Makes smart computers learn faster with less memory.
Delays in Spiking Neural Networks: A State Space Model Approach
Machine Learning (CS)
Lets brain-like computers remember past events.
Synchrony-Gated Plasticity with Dopamine Modulation for Spiking Neural Networks
Neural and Evolutionary Computing
Makes AI learn better by mimicking brain signals.