Spike-timing-dependent Hebbian learning as noisy gradient descent
By: Niklas Dexheimer, Sascha Gaudlitz, Johannes Schmidt-Hieber
Potential Business Impact:
Explains mathematically how brain connections strengthen when neurons fire together.
Hebbian learning is a key principle underlying learning in biological neural networks. It postulates that synaptic changes occur locally, depending on the activities of pre- and postsynaptic neurons. While Hebbian learning based on neuronal firing rates is well explored, much less is known about learning rules that account for precise spike-timing. We relate a Hebbian spike-timing-dependent plasticity rule to noisy gradient descent with respect to a natural loss function on the probability simplex. This connection allows us to prove that the learning rule eventually identifies the presynaptic neuron with the highest activity. We also discover an intrinsic connection to noisy mirror descent.
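The abstract's core idea can be sketched in a few lines: a multiplicative (exponentiated-gradient) update with noisy spike input is exactly a noisy mirror-descent step on the probability simplex, and it concentrates the synaptic weight on the most active presynaptic neuron. The simulation below is a minimal illustration of this general mechanism, not the authors' exact STDP rule; the firing rates, learning rate, and horizon are assumed for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: d presynaptic neurons spike independently per time step,
# with neuron 0 having the highest activity. Synaptic weights w live on
# the probability simplex (nonnegative, summing to 1).
rates = np.array([0.9, 0.5, 0.4, 0.2])  # presynaptic firing probabilities
d = len(rates)
w = np.full(d, 1.0 / d)                 # start uniform on the simplex
eta = 0.05                              # learning rate (assumed)

for t in range(20_000):
    spikes = (rng.random(d) < rates).astype(float)  # noisy pre activity
    # Noisy mirror-descent step with KL geometry: reward coincident
    # spiking multiplicatively, then renormalize onto the simplex.
    w *= np.exp(eta * spikes)
    w /= w.sum()

# The weight mass ends up concentrated on the most active neuron.
print(np.argmax(w))
```

Because the update is multiplicative, the log-weights accumulate the (noisy) activities; the neuron with the highest rate wins by a law-of-large-numbers argument, which mirrors the identification result stated in the abstract.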
Similar Papers
Learning in Spiking Neural Networks with a Calcium-based Hebbian Rule for Spike-timing-dependent Plasticity
Neural and Evolutionary Computing
Teaches computers to learn like brains do.
Weight transport through spike timing for robust local gradients
Neurons and Cognition
Helps brain-like computers learn faster and better.
Extending Spike-Timing Dependent Plasticity to Learning Synaptic Delays
Neural and Evolutionary Computing
Teaches computer brains to learn faster.