See What I Mean? Mobile Eye-Perspective Rendering for Optical See-through Head-mounted Displays
By: Gerlinde Emsenhuber, Tobias Langlotz, Denis Kalkofen, and others
Potential Business Impact:
Makes AR glasses draw graphics where the wearer's eye actually sees the real world.
Image-based scene understanding allows Augmented Reality systems to provide contextual visual guidance in unprepared, real-world environments. While effective on video see-through (VST) head-mounted displays (HMDs), such methods suffer on optical see-through (OST) HMDs due to misregistration between the world-facing camera and the user's eye perspective. To approximate the user's true eye view, we implement and evaluate three software-based eye-perspective rendering (EPR) techniques on a commercially available, untethered OST HMD (Microsoft HoloLens 2): (1) Plane-Proxy EPR, projecting onto a fixed-distance plane; (2) Mesh-Proxy EPR, using SLAM-based reconstruction for projection; and (3) Gaze-Proxy EPR, a novel eye-tracking-based method that aligns the projection with the user's gaze depth. A user study on real-world tasks underscores the importance of accurate EPR and demonstrates gaze-proxy as a lightweight alternative to geometry-based methods. We release our EPR framework as open source.
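The Plane-Proxy technique in the abstract can be illustrated with a standard plane-induced homography: pixels from the world-facing camera are projected onto a proxy plane at a fixed depth and re-imaged from the eye's viewpoint. The sketch below is not the authors' released framework; it is a minimal NumPy illustration assuming pinhole intrinsics for both views, a camera-to-eye rigid transform (R, t), and a proxy plane n·X = d in the camera frame (all names are illustrative).

```python
import numpy as np

def plane_proxy_homography(K_cam, K_eye, R, t, n, d):
    """Homography mapping camera-image pixels to eye-view pixels,
    assuming all scene points lie on a proxy plane n.X = d (camera frame).
    K_cam, K_eye: 3x3 pinhole intrinsics; R, t: camera-to-eye transform
    (X_eye = R @ X_cam + t); n: unit plane normal; d: plane depth in metres.
    Classic plane-induced homography: H = K_eye (R + t n^T / d) K_cam^-1.
    """
    H = K_eye @ (R + np.outer(t, n) / d) @ np.linalg.inv(K_cam)
    return H / H[2, 2]  # normalize so H[2,2] == 1

def warp_point(H, uv):
    """Apply homography H to a pixel (u, v); returns the warped (u, v)."""
    p = H @ np.array([uv[0], uv[1], 1.0])
    return p[:2] / p[2]
```

Points off the proxy plane are misregistered in proportion to their depth error, which is why the abstract's Mesh-Proxy variant substitutes reconstructed geometry and the Gaze-Proxy variant moves the plane to the user's current gaze depth.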
Similar Papers
Impact of Target and Tool Visualization on Depth Perception and Usability in Optical See-Through AR
Human-Computer Interaction
Makes holograms clearer for surgery.
Positioning Monocular Optical See-Through Head-Worn Displays in Glasses for Everyday Wear
Human-Computer Interaction
Makes smart glasses less annoying to wear.
Real-Time Kinematic Positioning and Optical See-Through Head-Mounted Display for Outdoor Tracking: Hybrid System and Preliminary Assessment
Human-Computer Interaction
Shows you hidden things outside, precisely.