Mind the Gaze: Improving the Usability of Dwell Input by Adapting Gaze Targets Based on Viewing Distance
By: Omar Namnakani, Yasmeen Abdrabou, Cristina Fiani, et al.
Dwell input shows promise for handheld mobile contexts, but its performance is impacted by target size and viewing distance. While fixed target sizes suffice in static setups, in mobile settings, frequent posture changes alter viewing distances, which in turn distort perceived size and hinder dwell performance. We address this through GAUI, a Gaze-based Adaptive User Interface that dynamically resizes targets to maximise performance at the given viewing distance. In a two-phased study (N=24), GAUI leveraged the strengths of its distance-responsive design, outperforming the large UI static baseline in task time, and being less error-prone than the small UI static baseline. It was rated the most preferred interface overall. Participants reflected on using GAUI in six different postures. We discuss how their experience is impacted by posture, and propose guidelines for designing context-aware adaptive UIs for dwell interfaces on handheld mobile devices that maximise performance.
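The abstract describes GAUI resizing targets so that their effective (perceived) size stays usable as viewing distance changes. The paper does not give its resizing algorithm here, but a common way to achieve this is a constant-visual-angle policy: pick a target visual angle and compute the physical size that subtends it at the current distance. The sketch below illustrates that idea; the function names, the 2° angle, and the DPI value are illustrative assumptions, not the authors' actual parameters.

```python
import math

def target_size_mm(distance_mm: float, visual_angle_deg: float = 2.0) -> float:
    """Physical target size (mm) that subtends a fixed visual angle
    at the given viewing distance. Standard geometry:
    size = 2 * d * tan(theta / 2). The 2-degree default is an
    assumption for illustration, not a value from the paper."""
    return 2.0 * distance_mm * math.tan(math.radians(visual_angle_deg) / 2.0)

def size_mm_to_px(size_mm: float, dpi: float = 160.0) -> float:
    """Convert a physical size to screen pixels (25.4 mm per inch).
    The DPI is device-specific; 160 is a hypothetical example."""
    return size_mm / 25.4 * dpi

# Example: as the phone moves from 300 mm to 500 mm away,
# the target grows to keep its visual angle constant.
near = target_size_mm(300.0)   # ~10.5 mm
far = target_size_mm(500.0)    # ~17.5 mm
```

Under this policy the pixel size scales linearly with distance, so a UI only needs a distance estimate (e.g. from the front camera) to pick the target size each frame.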
Similar Papers
GazeBlend: Exploring Paired Gaze-Based Input Techniques for Navigation and Selection Tasks on Mobile Devices
Human-Computer Interaction
Lets you control phones better with your eyes.
Crossing-Based Interactions Using an Eye-Tracking System
Human-Computer Interaction
Lets you control computers faster with your eyes.
Viewpoint-Tolerant Depth Perception for Shared Extended Space Experience on Wall-Sized Display
Human-Computer Interaction
Makes big screens show 3D for everyone, no glasses.