Effects of Introducing Synaptic Scaling on Spiking Neural Network Learning

Published: January 16, 2026 | arXiv ID: 2601.11261v1

By: Shinnosuke Touda, Hirotsugu Okuno

Potential Business Impact:

Shows that brain-inspired learning rules, particularly L2-norm synaptic scaling, can help spiking neural networks classify images accurately after a single pass through the training data.

Business Areas:
Neuroscience Biotechnology, Science and Engineering

Spiking neural networks (SNNs) employing unsupervised learning methods inspired by neural plasticity are expected to provide a new framework for artificial intelligence. In this study, we investigated the effect of multiple types of neural plasticity, such as spike-timing-dependent plasticity (STDP) and synaptic scaling, on learning in a winner-take-all (WTA) network composed of spiking neurons. We implemented a WTA network with multiple types of neural plasticity in Python. The MNIST and Fashion-MNIST datasets were used for training and testing. We varied the number of neurons, the time constant of STDP, and the normalization method used in synaptic scaling to compare classification accuracy. The results demonstrated that synaptic scaling based on the L2 norm was the most effective in improving classification performance. With L2-norm-based synaptic scaling and 400 neurons in each of the excitatory and inhibitory layers, the network achieved classification accuracies of 88.84% on MNIST and 68.01% on Fashion-MNIST after one epoch of training.
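The abstract does not spell out the normalization details, but a minimal sketch of L2-norm-based synaptic scaling might look like the following. The function name `synaptic_scaling_l2` and the target norm `w_target` are illustrative assumptions, not the authors' code; the only details taken from the paper are the use of the L2 norm and the network dimensions (784 MNIST inputs, 400 excitatory neurons).

```python
# A minimal sketch of L2-norm-based synaptic scaling, under the assumption
# that each excitatory neuron's incoming weight vector is rescaled to a
# fixed L2 norm after STDP updates. `w_target` is a hypothetical parameter
# chosen for illustration.
import numpy as np

def synaptic_scaling_l2(weights: np.ndarray, w_target: float = 1.0) -> np.ndarray:
    """Rescale each postsynaptic neuron's incoming weights to a fixed L2 norm.

    weights: (n_inputs, n_neurons) matrix of synaptic weights.
    w_target: desired L2 norm of each column (hypothetical value).
    """
    norms = np.linalg.norm(weights, axis=0, keepdims=True)  # per-neuron L2 norm
    norms[norms == 0] = 1.0  # avoid division by zero for silent neurons
    return weights * (w_target / norms)

# Example: 784 input pixels (MNIST) projecting to 400 excitatory neurons,
# matching the network size reported in the abstract.
rng = np.random.default_rng(0)
w = rng.random((784, 400))
w_scaled = synaptic_scaling_l2(w)
print(np.linalg.norm(w_scaled, axis=0)[:5])  # each column now has L2 norm 1.0
```

Scaling of this kind keeps the total synaptic strength of each neuron bounded, which prevents a few neurons from monopolizing the WTA competition while STDP continues to shape the relative weight pattern.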

Country of Origin
πŸ‡―πŸ‡΅ Japan

Page Count
6 pages

Category
Computer Science:
Neural and Evolutionary Computing