I Like To Move It -- Computation Instead of Data in the Brain
By: Fabian Czappa, Marvin Kaster, Felix Wolf
Potential Business Impact:
Makes brain simulations run six times faster.
The detailed functioning of the human brain is still poorly understood. Brain simulations are a well-established way to complement experimental research, but they must contend with the computational demands of approximately $10^{11}$ neurons and the $10^{14}$ synapses connecting them; the network formed by the latter is called the connectome. Studies suggest that changes in the connectome (i.e., the formation and deletion of synapses, also known as structural plasticity) are essential for critical tasks such as memory formation and learning. The connectivity update can be computed efficiently using a Barnes-Hut-inspired approximation that lowers the computational complexity from $O(n^2)$ to $O(n \log n)$, where $n$ is the number of neurons. However, updating synapses, which relies heavily on remote memory access (RMA), and the spike exchange between neurons, which requires all-to-all communication at every time step, still hinder scalability. We present a new algorithm that significantly reduces communication overhead by moving computation instead of data. This shrinks the time needed to update connectivity by a factor of six and the time needed to exchange spikes by more than two orders of magnitude.
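To make the Barnes-Hut-inspired connectivity update concrete, here is a minimal 2-D sketch of the idea: a quadtree (standing in for the octree a 3-D simulation would use) aggregates distant groups of dendrites into a single centroid with a count, so a vacant axon's total attraction is summed in $O(\log n)$ cell visits instead of over all $n$ neurons. The Gaussian `kernel`, the accuracy parameter `theta`, and all function names are illustrative assumptions, not the paper's actual implementation.

```python
import math
import random

def kernel(d, sigma=2.0):
    # Assumed Gaussian attraction between a vacant axon and dendrites at distance d.
    return math.exp(-(d * d) / (sigma * sigma))

class Cell:
    """Quadtree node: a square region with the centroid and count of its dendrites."""
    __slots__ = ("x", "y", "size", "count", "cx", "cy", "children")
    def __init__(self, x, y, size):
        self.x, self.y, self.size = x, y, size  # lower-left corner, edge length
        self.count = 0
        self.cx = self.cy = 0.0                 # centroid of contained dendrites
        self.children = None                    # non-empty sub-cells, once subdivided

def build(points, x, y, size):
    node = Cell(x, y, size)
    node.count = len(points)
    if not points:
        return node
    node.cx = sum(p[0] for p in points) / len(points)
    node.cy = sum(p[1] for p in points) / len(points)
    if len(points) > 1 and size > 1e-6:
        h = size / 2.0
        quads = {(0, 0): [], (0, 1): [], (1, 0): [], (1, 1): []}
        for px, py in points:
            quads[(px >= x + h, py >= y + h)].append((px, py))
        node.children = [build(pts, x + (h if qx else 0), y + (h if qy else 0), h)
                         for (qx, qy), pts in quads.items() if pts]
    return node

def attraction(node, qx, qy, theta=0.5):
    """Barnes-Hut-style sum of kernel() over all dendrites, opening only nearby cells."""
    d = math.hypot(node.cx - qx, node.cy - qy)
    if node.children is None or node.size < theta * d:
        # Leaf, or far enough away: treat the cell's dendrites as one point mass.
        return node.count * kernel(d)
    return sum(attraction(c, qx, qy, theta) for c in node.children)
```

Setting `theta = 0` forces the traversal down to the leaves and recovers the exact $O(n)$ sum, which is a convenient way to check the approximation error of larger `theta` values.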
Similar Papers
Adaptive Synaptogenesis Implemented on a Nanomagnetic Platform
Disordered Systems and Neural Networks
Brain learning trick helps computers remember everything.
Data-Efficient Neural Training with Dynamic Connectomes
Neurons and Cognition
Shows how computer brains learn by watching their activity.
Connectome-Guided Automatic Learning Rates for Deep Networks
Neural and Evolutionary Computing
Teaches computers to learn like brains.