WEBEYETRACK: Scalable Eye-Tracking for the Browser via On-Device Few-Shot Personalization
By: Eduardo Davalos, Yike Zhang, Namrata Srivastava, and others
Potential Business Impact:
Tracks eyes accurately from your webcam.
With advancements in AI, new gaze estimation methods are exceeding state-of-the-art (SOTA) benchmarks, but their real-world application reveals a gap with commercial eye-tracking solutions. Factors like model size, inference time, and privacy often go unaddressed. Meanwhile, webcam-based eye-tracking methods lack sufficient accuracy, particularly under head movement. To tackle these issues, we introduce WebEyeTrack, a framework that integrates lightweight SOTA gaze estimation models directly in the browser. It incorporates model-based head pose estimation and on-device few-shot learning with as few as nine calibration samples (k ≤ 9). WebEyeTrack adapts to new users, achieving SOTA performance with an error margin of 2.32 cm on GazeCapture and real-time inference speeds of 2.4 milliseconds on an iPhone 14. Our open-source code is available at https://github.com/RedForestAi/WebEyeTrack.
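To give a rough sense of what on-device few-shot personalization with a handful of calibration points can look like, here is a minimal sketch (not the paper's actual method) that fits a ridge-regularized affine correction on top of a base model's gaze predictions using k = 9 calibration samples. The function names and simulated data are illustrative assumptions:

```python
import numpy as np

def fit_affine_calibration(pred_xy, true_xy, ridge=1e-3):
    """Fit a 2D affine map taking base-model gaze predictions to
    ground-truth screen coordinates from k calibration samples."""
    k = pred_xy.shape[0]
    X = np.hstack([pred_xy, np.ones((k, 1))])            # (k, 3) homogeneous coords
    # Ridge regularization keeps the least-squares fit stable for small k
    A = np.linalg.solve(X.T @ X + ridge * np.eye(3), X.T @ true_xy)  # (3, 2)
    return A

def apply_calibration(A, pred_xy):
    """Apply the fitted affine correction to new predictions."""
    X = np.hstack([pred_xy, np.ones((pred_xy.shape[0], 1))])
    return X @ A

# Hypothetical example: k = 9 calibration targets on a 3x3 screen grid (cm)
rng = np.random.default_rng(0)
true_pts = np.array([[x, y] for y in (0.0, 10.0, 20.0) for x in (0.0, 15.0, 30.0)])
# Simulate a scaled, biased base-model prediction with small noise
pred_pts = 0.9 * true_pts + np.array([1.5, -2.0]) + rng.normal(0, 0.1, true_pts.shape)

A = fit_affine_calibration(pred_pts, true_pts)
corrected = apply_calibration(A, pred_pts)
base_err = np.linalg.norm(pred_pts - true_pts, axis=1).mean()
calib_err = np.linalg.norm(corrected - true_pts, axis=1).mean()
```

Such a closed-form correction is cheap enough to run in the browser after each calibration pass; the paper's framework instead performs few-shot learning on the gaze model itself.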
Similar Papers
GazeTrack: High-Precision Eye Tracking Based on Regularization and Spatial Computing
CV and Pattern Recognition
Makes virtual reality eyes track more accurately.
Evaluating Sensitivity Parameters in Smartphone-Based Gaze Estimation: A Comparative Study of Appearance-Based and Infrared Eye Trackers
CV and Pattern Recognition
Lets phones track where you look.
FlatTrack: Eye-tracking with ultra-thin lensless cameras
Image and Video Processing
Makes VR headsets smaller and lighter.