Exploring the Feasibility of Gaze-Based Navigation Across Path Types
By: Yichuan Zhang, Liangyuting Zhang, Xuning Hu, and more
Potential Business Impact:
Lets you control virtual worlds by looking.
Gaze input, as a modality that inherently conveys user intent, offers intuitive and immersive experiences in extended reality (XR). With eye tracking now a standard feature in modern XR headsets, gaze has been widely applied to tasks such as selection, text entry, and object manipulation. However, gaze-based navigation, despite being a fundamental interaction task, remains largely underexplored. In particular, little is known about which path types are well suited to gaze navigation and under what conditions it performs effectively. To bridge this gap, we conducted a controlled user study evaluating gaze-based navigation across three representative path types: linear, narrowing, and circular. Our findings reveal distinct performance characteristics and parameter ranges for each path type, offering design insights and practical guidelines for future gaze-driven navigation systems in XR.
Similar Papers
Gaze-Hand Steering for Travel and Multitasking in Virtual Environments
Human-Computer Interaction
Lets you control virtual worlds with eyes and hands.
GazeBlend: Exploring Paired Gaze-Based Input Techniques for Navigation and Selection Tasks on Mobile Devices
Human-Computer Interaction
Lets you control phones with your eyes better.
Multimodal Perception for Goal-oriented Navigation: A Survey
Robotics
Helps robots learn to find their way around.