Small Cues, Big Differences: Evaluating Interaction and Presentation for Annotation Retrieval in AR
By: Zahra Borhani, Ali Ebrahimpour-Boroojeny, Francisco R. Ortega
Potential Business Impact:
Finds virtual notes in the real world faster.
Augmented Reality (AR) enables intuitive interaction with virtual annotations overlaid on the real world, supporting a wide range of applications such as remote assistance, education, and industrial training. However, as the number of heterogeneous annotations increases, their efficient retrieval remains an open challenge in 3D environments. This paper examines how interaction modalities and presentation designs affect user performance, workload, fatigue, and preference in AR annotation retrieval. In two user studies, we compare eye-gaze versus hand-ray hovering and evaluate four presentation methods: Opacity-based, Scale-based, Nothing-based, and Marker-based. Results show that users favored eye-gaze over hand-ray, despite its significantly higher rate of unintentional activations. Among the presentation methods, Scale-based presentation reduced workload and task completion time while aligning with user preferences. Our findings offer empirical insights into the effectiveness of different annotation presentation methods, leading to design recommendations for building more efficient and user-friendly AR annotation review systems.
Similar Papers
Examining the Effects of Immersive and Non-Immersive Presenter Modalities on Engagement and Social Interaction in Co-located Augmented Presentations
Human-Computer Interaction
Makes AR presentations more social and engaging.
Exploring the Effect of Viewing Attributes of Mobile AR Interfaces on Remote Collaborative and Competitive Tasks
Human-Computer Interaction
Lets people work together on tasks from far away.
Scene Awareness While Using Multiple Navigation Aids in AR Search
Human-Computer Interaction
AR navigation tools can hurt your memory.