Ocular Authentication: Fusion of Gaze and Periocular Modalities
By: Dillon Lohr, Michael J. Proulx, Mehedi Hasan Raju, and others
Potential Business Impact:
Unlocks your phone by looking at it.
This paper investigates the feasibility of fusing two eye-centric authentication modalities, eye movements and periocular images, within a calibration-free authentication system. While each modality has independently shown promise for user authentication, their combination within a unified gaze-estimation pipeline has not been thoroughly explored at scale. In this report, we propose a multimodal authentication system and evaluate it on a large-scale in-house dataset of 9,202 subjects, with eye-tracking (ET) signal quality equivalent to that of a consumer-facing virtual reality (VR) device. Our results show that the multimodal approach consistently outperforms both unimodal systems across all scenarios, surpassing the FIDO benchmark. A state-of-the-art machine learning architecture contributed significantly to authentication performance at scale, driven by the model's ability to learn discriminative authentication representations and by the complementary characteristics of the fused modalities.
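The abstract does not specify how the two modalities are combined; one common approach in multimodal biometrics is score-level fusion, where each modality produces a similarity score and a weighted combination drives the accept/reject decision. The sketch below illustrates that idea only; the function names, the weight `w`, and the threshold are illustrative assumptions, not the paper's method.

```python
def fuse_scores(gaze_score: float, periocular_score: float, w: float = 0.5) -> float:
    """Weighted-sum score-level fusion of two unimodal similarity scores.

    Both scores are assumed to be normalized to [0, 1]; `w` balances the
    gaze modality against the periocular modality.
    """
    return w * gaze_score + (1.0 - w) * periocular_score


def authenticate(gaze_score: float, periocular_score: float,
                 threshold: float = 0.7, w: float = 0.5) -> bool:
    """Accept the claimed identity if the fused score clears the threshold."""
    return fuse_scores(gaze_score, periocular_score, w) >= threshold


# Example: a strong periocular match can compensate for a weaker gaze match,
# which is one way fusion can outperform either modality alone.
print(authenticate(0.55, 0.90))  # fused score 0.725 -> True
```

In practice the weight and threshold would be tuned on a validation set against a target false-accept rate, such as the FIDO benchmark mentioned above.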
Similar Papers
Gaze Authentication: Factors Influencing Authentication Performance
CV and Pattern Recognition
Makes eye logins more secure and reliable.
Evaluating the long-term viability of eye-tracking for continuous authentication in virtual reality
Cryptography and Security
Keeps VR games safe by watching how you look.
Mind Your Vision: Multimodal Estimation of Refractive Disorders Using Electrooculography and Eye Tracking
Image and Video Processing
Helps check eyesight by watching eye movements.