Deep Learning-Based Visual Fatigue Detection Using Eye Gaze Patterns in VR
By: Numan Zafar, Johnathan Locke, Shafique Ahmad Chaudhry
Potential Business Impact:
Spots tired eyes in VR games.
Prolonged exposure to virtual reality (VR) systems leads to visual fatigue, which impairs user comfort, performance, and safety, particularly in high-stakes or long-duration applications. Existing fatigue detection approaches rely on subjective questionnaires or intrusive physiological signals, such as EEG, heart rate, or eye-blink count, which limits their scalability and real-time applicability. This paper introduces a deep learning-based approach for detecting visual fatigue using continuous eye-gaze trajectories recorded in VR. We use the GazeBaseVR dataset, comprising binocular eye-tracking data from 407 participants across five immersive tasks, extract cyclopean eye-gaze angles, and evaluate six deep classifiers. Our results demonstrate that EKYT achieves up to 94% accuracy, particularly in tasks demanding high visual attention, such as video viewing and text reading. We further analyze gaze variance and subjective fatigue measures, finding significant behavioral differences between fatigued and non-fatigued conditions. These findings establish eye-gaze dynamics as a reliable and nonintrusive modality for continuous fatigue detection in immersive VR, offering practical implications for adaptive human-computer interaction.
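As a rough illustration of the preprocessing the abstract describes, a common convention is to form a "cyclopean" gaze signal by averaging the left- and right-eye gaze angles per sample, then summarize it with statistics such as gaze variance. This is a minimal sketch under that assumption; the paper's exact pipeline and feature set may differ.

```python
import numpy as np

def cyclopean_gaze(left_deg: np.ndarray, right_deg: np.ndarray) -> np.ndarray:
    """Average left- and right-eye gaze angles (degrees) per sample.

    One common way to derive a single cyclopean trace from binocular
    eye-tracking data; assumed here, not taken from the paper.
    """
    return (left_deg + right_deg) / 2.0

# Toy binocular horizontal gaze trace (degrees), 4 samples.
left = np.array([1.0, 2.0, 3.0, 4.0])
right = np.array([3.0, 2.0, 1.0, 0.0])

cyc = cyclopean_gaze(left, right)   # cyclopean trace
gaze_var = float(np.var(cyc))       # gaze variance, one fatigue indicator
```

The resulting cyclopean trajectory would then be windowed and fed to a sequence classifier, with summary statistics like `gaze_var` compared across fatigued and non-fatigued conditions.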
Similar Papers
Fatigue-Aware Adaptive Interfaces for Wearable Devices Using Deep Learning
Machine Learning (CS)
Smartwatches get smarter, reducing tiredness.
NeuroGaze: A Hybrid EEG and Eye-Tracking Brain-Computer Interface for Hands-Free Interaction in Virtual Reality
Human-Computer Interaction
Control virtual worlds with your eyes and brain.
Towards Intelligent VR Training: A Physiological Adaptation Framework for Cognitive Load and Stress Detection
Human-Computer Interaction
Makes VR training harder or easier automatically.