Sigma-Delta Neural Network Conversion on Loihi 2
By: Matthew Brehove, Sadia Anjum Tumpa, Espoir Kyubwa, and others
Potential Business Impact:
Lets trained AI models run inference with less power on brain-inspired chips.
Neuromorphic computing aims to improve the efficiency of artificial neural networks by taking inspiration from biological neurons and leveraging temporal sparsity, spatial sparsity, and near-/in-memory compute. Although these approaches have shown efficiency gains, training spiking neural networks (SNNs) remains difficult. The original attempts at converting trained conventional analog neural networks (ANNs) to SNNs used the rate of binary spikes to represent neuron activations, which required many simulation time steps per inference and degraded efficiency. Intel's Loihi 2 is a neuromorphic platform that supports graded spikes, which can be used to represent changes in neuron activation. In this work, we use Loihi 2's graded spikes to develop a method for converting ANNs to spiking networks that take advantage of temporal and spatial sparsity. We evaluated the performance of this network on Loihi 2 and compared it to NVIDIA's Jetson Xavier edge AI platform.
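The core idea behind sigma-delta conversion is that a neuron only transmits the change (delta) in its activation as a graded spike, and the receiving side accumulates (sigma) those changes to recover the activation, so a slowly varying signal generates very few spikes. The sketch below illustrates this in plain NumPy; the function names, threshold value, and example signal are illustrative assumptions, not the paper's implementation or Loihi 2's actual API.

```python
import numpy as np

def delta_encode(activations, threshold=0.1):
    """Emit a graded spike only when the activation has drifted by more than
    `threshold` since the last emitted spike (temporal sparsity)."""
    spikes = np.zeros_like(activations)
    last_sent = 0.0
    for t, a in enumerate(activations):
        residual = a - last_sent
        if abs(residual) >= threshold:
            spikes[t] = residual      # graded spike carries the change itself
            last_sent += residual
    return spikes

def sigma_decode(spikes):
    """Reconstruct the activation at the receiver by accumulating the graded
    spikes over time -- the 'sigma' half of sigma-delta."""
    return np.cumsum(spikes)

# Example: a slowly varying activation produces only a handful of spikes.
acts = np.array([0.0, 0.05, 0.5, 0.52, 0.53, 1.0, 1.0, 0.2])
spikes = delta_encode(acts)
print("spikes:", spikes)                  # mostly zeros -> sparse traffic
print("reconstruction:", sigma_decode(spikes))  # tracks acts within the threshold
```

Under this scheme, communication cost scales with how much activations change between time steps rather than with their magnitude, which is the temporal sparsity the converted network exploits.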
Similar Papers
A Complete Pipeline for deploying SNNs with Synaptic Delays on Loihi 2 (Neural and Evolutionary Computing): Makes computers learn faster with less power.
Autonomous Reinforcement Learning Robot Control with Intel's Loihi 2 Neuromorphic Hardware (Robotics): Robots learn faster and use less power.
Spiking Neural Networks: The Future of Brain-Inspired Computing (Neural and Evolutionary Computing): Makes computers use less power to think.