Sub-Millisecond Event-Based Eye Tracking on a Resource-Constrained Microcontroller
By: Marco Giordano, Pietro Bonazzi, Luca Benini, and more
Potential Business Impact:
Enables fast, low-power eye tracking fully on-device, suitable for smart glasses and other battery-powered wearables.
This paper presents a novel event-based eye-tracking system deployed on a resource-constrained microcontroller, addressing the challenges of real-time, low-latency, and low-power performance in embedded systems. The system leverages a Dynamic Vision Sensor (DVS), specifically the DVXplorer Micro, with an average temporal resolution of 200 µs, to capture rapid eye movements with extremely low latency. The system is implemented on a novel low-power, high-performance microcontroller from STMicroelectronics, the STM32N6. The microcontroller features an 800 MHz Arm Cortex-M55 core and an AI hardware accelerator, the Neural-ART Accelerator, enabling real-time inference with milliwatt-level power consumption. The paper proposes a hardware-aware and sensor-aware compact Convolutional Neural Network (CNN) optimized for event-based data and deployed at the edge, achieving a mean pupil prediction error of 5.99 pixels and a median error of 5.73 pixels on the Ini-30 dataset. The system achieves an end-to-end inference latency of just 385 µs and a neural network throughput of 52 Multiply and Accumulate (MAC) operations per cycle while consuming just 155 µJ of energy. This approach enables a fully embedded, energy-efficient eye-tracking solution suitable for applications such as smart glasses and wearable devices.
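The abstract does not spell out how the raw DVS event stream is turned into CNN input, but a common choice in event-based pipelines is to bin events into a fixed-size polarity histogram that a compact CNN can consume at a fixed latency budget. The sketch below is a minimal, illustrative Python version of such an encoding; the function name, the 64x64 window, and the two-channel (ON/OFF polarity) layout are assumptions for illustration, not the paper's exact preprocessing.

```python
import numpy as np

def events_to_frame(events, height=64, width=64):
    """Accumulate a batch of DVS events into a 2-channel polarity
    histogram, a common input representation for event-based CNNs.
    (Hypothetical helper; the paper's actual encoding may differ.)

    events: array of shape (N, 4) with columns (x, y, t, p),
            where p is polarity in {0, 1}.
    """
    frame = np.zeros((2, height, width), dtype=np.float32)
    x = events[:, 0].astype(int)
    y = events[:, 1].astype(int)
    p = events[:, 3].astype(int)
    # Count events per pixel, separated into ON/OFF polarity channels.
    np.add.at(frame, (p, y, x), 1.0)
    return frame

# Example: 1000 synthetic events on a 64x64 sensor window.
rng = np.random.default_rng(0)
ev = np.column_stack([
    rng.integers(0, 64, 1000),   # x coordinates
    rng.integers(0, 64, 1000),   # y coordinates
    np.sort(rng.random(1000)),   # timestamps (normalized)
    rng.integers(0, 2, 1000),    # polarity
])
inp = events_to_frame(ev)
print(inp.shape)  # (2, 64, 64), ready for a compact CNN
```

A dense binned representation like this keeps the CNN input shape constant regardless of event rate, which is what makes a fixed sub-millisecond inference latency achievable on an accelerator such as the Neural-ART.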
Similar Papers
JaneEye: A 12-nm 2K-FPS 18.9-µJ/Frame Event-based Eye Tracking Accelerator
Signal Processing
Tracks eyes fast and uses little power.
Neuromorphic Eye Tracking for Low-Latency Pupil Detection
CV and Pattern Recognition
Enables VR/AR glasses to track eyes quickly at low power.