Investigating Encoding and Perspective for Augmented Reality
By: Jade Kandel, Sriya Kasumarthi, Spiros Tsalikis, and more
Potential Business Impact:
Helps AR guide any body movement, not just arms.
Augmented reality (AR) offers promising opportunities to support movement-based activities, such as personal training or physical therapy, with real-time, spatially situated visual cues. While many approaches leverage AR to guide motion, existing design guidelines focus on simple, upper-body movements within the user's field of view. We lack evidence-based design recommendations for guiding more diverse scenarios involving movements with varying levels of visibility and direction. We conducted an experiment to investigate how different visual encodings and perspectives affect motion guidance performance and usability, using three exercises that varied in visibility and planes of motion. Our findings reveal significant differences in preference and performance across designs. Notably, the best perspective varied depending on motion visibility, and showing more information about the overall motion did not necessarily improve motion execution. We provide empirically grounded guidelines for designing immersive, interactive visualizations for motion guidance to support more effective AR systems.
Similar Papers
AR as an Evaluation Playground: Bridging Metrics and Visual Perception of Computer Vision Models
CV and Pattern Recognition
Lets people test computer vision with games.
On the Go with AR: Attention to Virtual and Physical Targets while Varying Augmentation Density
Human-Computer Interaction
AR glasses help you find things without getting lost.
Enhancing User Performance and Human Factors through Visual Guidance in AR Assembly Tasks
Human-Computer Interaction
Makes augmented reality tasks faster, but with more mistakes.