Synaptic Pruning: A Biological Inspiration for Deep Learning Regularization
By: Gideon Vos, Liza van Eijk, Zoltan Sarnyai, et al.
Potential Business Impact:
Makes computer brains learn smarter and faster.
Synaptic pruning in biological brains removes weak connections to improve efficiency. In contrast, dropout regularization in artificial neural networks randomly deactivates neurons without considering activity-dependent pruning. We propose a magnitude-based synaptic pruning method that better reflects biology by progressively removing low-importance connections during training. Integrated directly into the training loop as a dropout replacement, our approach computes weight importance from absolute magnitudes across layers and applies a cubic schedule to gradually increase global sparsity. At fixed intervals, pruning masks permanently remove low-importance weights while maintaining gradient flow for active ones, eliminating the need for separate pruning and fine-tuning phases. Experiments on multiple time series forecasting models including RNN, LSTM, and Patch Time Series Transformer across four datasets show consistent gains. Our method ranked best overall, with statistically significant improvements confirmed by Friedman tests (p < 0.01). In financial forecasting, it reduced Mean Absolute Error by up to 20% over models with no or standard dropout, and up to 52% in select transformer models. This dynamic pruning mechanism advances regularization by coupling weight elimination with progressive sparsification, offering easy integration into diverse architectures. Its strong performance, especially in financial time series forecasting, highlights its potential as a practical alternative to conventional dropout techniques.
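The mechanism the abstract describes, ranking weights by absolute magnitude across all layers and removing the lowest-importance ones as a cubic schedule ramps global sparsity toward a target, can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the schedule form `s_f * (1 - (1 - t/T)^3)` and the helper names `cubic_sparsity` and `global_magnitude_masks` are assumptions, and details such as tie-breaking and per-layer exemptions are omitted.

```python
import numpy as np

def cubic_sparsity(step, total_steps, final_sparsity):
    """Cubic schedule: sparsity grows from 0 toward final_sparsity,
    rising quickly early in training and flattening near the end."""
    frac = min(step / total_steps, 1.0)
    return final_sparsity * (1.0 - (1.0 - frac) ** 3)

def global_magnitude_masks(weights, sparsity):
    """Rank every weight by |magnitude| across all layers and return
    boolean masks that zero out the lowest-magnitude fraction."""
    all_mags = np.concatenate([np.abs(w).ravel() for w in weights])
    k = int(sparsity * all_mags.size)  # number of weights to prune
    if k == 0:
        return [np.ones_like(w, dtype=bool) for w in weights]
    # k-th smallest magnitude becomes the global pruning threshold
    threshold = np.partition(all_mags, k - 1)[k - 1]
    return [np.abs(w) > threshold for w in weights]

# Toy example: two "layers" of weights, pruned at 50% global sparsity.
layers = [np.array([[0.1, -2.0], [0.3, 0.0]]), np.array([1.5, -0.05])]
masks = global_magnitude_masks(layers, cubic_sparsity(100, 100, 0.5))
pruned = [w * m for w, m in zip(layers, masks)]  # masked weights stay zero
```

In the training loop sketched here, the masks would be recomputed at fixed intervals and applied both to the weights and to their gradients, so pruned connections stay permanently removed while surviving weights continue to train, which is what lets this act as a drop-in replacement for dropout rather than a separate prune-then-fine-tune stage.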
Similar Papers
A flexible framework for structural plasticity in GPU-accelerated sparse spiking neural networks
Neural and Evolutionary Computing
Makes computer brains learn faster and use less power.
Toward Efficient Spiking Transformers: Synapse Pruning Meets Synergistic Learning-Based Compensation
Machine Learning (CS)
Makes AI smarter, smaller, and faster.
Pruning Everything, Everywhere, All at Once
CV and Pattern Recognition
Makes smart computer programs smaller and faster.