Live Demonstration: Neuromorphic Radar for Gesture Recognition
By: Satyapreet Singh Yadav, Akash K S, Chandra Sekhar Seelamantula, and more
Potential Business Impact:
Lets computers understand hand waves using tiny radar.
We present a neuromorphic radar framework for real-time, low-power hand gesture recognition (HGR) using an event-driven architecture inspired by biological sensing. Our system comprises a 24 GHz Doppler radar front-end and a custom neuromorphic sampler that converts intermediate-frequency (IF) signals into sparse spike-based representations via asynchronous sigma-delta encoding. These events are directly processed by a lightweight neural network deployed on a Cortex-M0 microcontroller, enabling low-latency inference without requiring spectrogram reconstruction. Unlike conventional radar HGR pipelines that continuously sample and process data, our architecture activates only when meaningful motion is detected, significantly reducing memory, power, and computation overhead. Evaluated on a dataset of five gestures collected from seven users, our system achieves > 85% real-time accuracy. To the best of our knowledge, this is the first work that employs bio-inspired asynchronous sigma-delta encoding and an event-driven processing framework for radar-based HGR.
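The central front-end idea in the abstract is the asynchronous sigma-delta encoder that turns the continuous IF waveform into sparse, event-based samples. The sketch below is a minimal discrete-time illustration of that encoding principle, not the authors' hardware design: the function name asdm_encode, the sample rate, and the bias/delta/kappa values are illustrative assumptions, since the abstract does not specify circuit parameters.

```python
import numpy as np

def asdm_encode(x, dt, bias=1.0, delta=0.05, kappa=1.0):
    """
    Discrete-time sketch of an asynchronous sigma-delta modulator (ASDM).

    The modulator integrates the difference between the input and a binary
    feedback signal; whenever the integral crosses +/- delta, the feedback
    flips and the crossing time is emitted as an event (spike).

    x     : sampled input (e.g. a radar IF signal), assumed |x| < bias
    dt    : sampling interval of x in seconds
    bias  : feedback amplitude (illustrative value)
    delta : Schmitt-trigger threshold (illustrative value)
    kappa : integrator gain (illustrative value)
    Returns a list of event times (seconds) at which the output toggles.
    """
    events = []
    y = 0.0      # integrator state
    z = bias     # binary feedback, toggles between +bias and -bias
    for n, xn in enumerate(x):
        y += dt * (xn - z) / kappa
        if z > 0 and y <= -delta:      # downward threshold crossing
            z = -bias
            events.append(n * dt)
        elif z < 0 and y >= delta:     # upward threshold crossing
            z = bias
            events.append(n * dt)
    return events

# Toy example: encode a 200 Hz Doppler-like IF tone and report how many
# toggle events replace the raw samples. Values are purely illustrative.
if __name__ == "__main__":
    fs = 10_000.0                                     # hypothetical IF sample rate (Hz)
    t = np.arange(0.0, 0.1, 1.0 / fs)
    if_signal = 0.6 * np.sin(2 * np.pi * 200.0 * t)   # toy Doppler tone
    spikes = asdm_encode(if_signal, 1.0 / fs)
    print(f"{len(spikes)} events from {len(t)} samples")
```

In the paper's pipeline such event times, rather than uniformly sampled data or reconstructed spectrograms, are what the lightweight network on the Cortex-M0 consumes; the gating described in the abstract additionally suppresses processing when no meaningful motion is present.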
Similar Papers
Advancing Radar Hand Gesture Recognition: A Hybrid Spectrum Synthetic Framework Merging Simulation with Neural Networks
Human-Computer Interaction
Lets computers understand hand waves better.
Spatiotemporal Radar Gesture Recognition with Hybrid Spiking Neural Networks: Balancing Accuracy and Efficiency
Neural and Evolutionary Computing
Saves energy for radar that sees people.