Can Biologically Plausible Temporal Credit Assignment Rules Match BPTT for Neural Similarity? E-prop as an Example
By: Yuhan Helena Liu, Guangyu Robert Yang, Christopher J. Cueva
Potential Business Impact:
Brain-like learning rules can match recorded brain activity as well as standard deep-learning training.
Understanding how the brain learns may be informed by studying biologically plausible learning rules. These rules, which typically approximate gradient descent while respecting biological constraints such as locality, must meet two criteria to serve as an appropriate brain model: (1) good performance on neuroscience tasks and (2) alignment with neural recordings. While extensive research has assessed the first criterion, the second remains underexamined. Employing methods such as Procrustes analysis on well-known neuroscience datasets, this study demonstrates the existence of a biologically plausible learning rule -- namely e-prop, which is based on gradient truncation and has demonstrated versatility across a wide range of tasks -- that can achieve neural data similarity comparable to Backpropagation Through Time (BPTT) when matched for task accuracy. Our findings also reveal that model architecture and initial conditions can play a more significant role in determining neural similarity than the specific learning rule. Furthermore, we observe that BPTT-trained models and their biologically plausible counterparts exhibit similar dynamical properties at comparable accuracies. These results underscore the substantial progress made in developing biologically plausible learning rules, highlighting their potential to achieve both competitive task performance and neural data similarity.
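The abstract's Procrustes-based comparison can be illustrated with a minimal sketch. This is not the paper's code: it assumes hypothetical activity matrices (time steps x neurons) and uses SciPy's `scipy.spatial.procrustes`, which standardizes both matrices and finds the optimal orthogonal alignment, returning a disparity (lower = more similar). A model whose activity is a rotated, rescaled copy of the recordings scores near zero; unrelated activity scores much higher.

```python
import numpy as np
from scipy.spatial import procrustes

rng = np.random.default_rng(0)

# Hypothetical "neural recordings": 100 time steps x 20 neurons.
neural = rng.standard_normal((100, 20))

# A model whose activity is a rotated, rescaled copy of the recordings.
# Procrustes analysis factors out rotation/reflection and scale,
# so this should yield a disparity near zero.
rotation, _ = np.linalg.qr(rng.standard_normal((20, 20)))
model_aligned = 3.0 * neural @ rotation

# An unrelated model, for contrast.
model_random = rng.standard_normal((100, 20))

_, _, d_aligned = procrustes(neural, model_aligned)
_, _, d_random = procrustes(neural, model_random)

print(f"aligned disparity: {d_aligned:.3g}, random disparity: {d_random:.3g}")
```

In the paper's setting, the two inputs would instead be trial-averaged population activity from recorded neurons and from a trained recurrent network, with lower disparity indicating closer neural similarity.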
Similar Papers
Traces Propagation: Memory-Efficient and Scalable Forward-Only Learning in Spiking Neural Networks
Machine Learning (CS)
Teaches computer brains to learn like real brains.
Event-driven eligibility propagation in large sparse networks: efficiency shaped by biological realism
Neural and Evolutionary Computing
Makes AI learn like brains, using less energy.