HandOver: Enabling Precise Selection & Manipulation of 3D Objects with Mouse and Hand Tracking
By: Esen K. Tütüncü, Mar Gonzalez-Franco, Eric J. Gonzalez
Potential Business Impact:
Lets you select and control 3D objects better with a mouse and your hands.
We present HandOver, an extended reality (XR) interaction technique designed to unify the precision of traditional mouse input for object selection with the expressiveness of hand tracking for object manipulation. With HandOver, the mouse drives a depth-aware 3D cursor, enabling precise and restful targeting; by hovering their hand over the mouse, the user can then seamlessly transition into direct 3D manipulation of the target object. In a formal user study, we compare HandOver against two ray-based techniques, traditional raycasting (Ray) and a hybrid method (Ray+Hand), in a 3D docking task. Results show that HandOver yields lower task errors across all distances and also improves interaction ergonomics, as indicated by a RULA posture analysis and self-reported measures (NASA-TLX). These findings illustrate the benefits of blending traditional precise input devices with the expressive gestural input afforded by hand tracking in XR, leading to improved user comfort and task performance. This blended paradigm yields a unified workflow that allows users to leverage the best of each input modality as they interact in immersive environments.
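The abstract describes a hand-off between two input modalities: mouse-driven, depth-aware cursor targeting for selection, and direct hand tracking for manipulation. The Python sketch below illustrates one way such a mode switch could be structured; the class names, thresholds, and update loop are illustrative assumptions for this summary, not the authors' implementation.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

# Illustrative sketch of a HandOver-style modality switch (assumed structure).
# Mouse input drives a depth-aware 3D cursor for selection; detecting the hand
# hovering over the mouse hands control over to direct hand manipulation.

Vec3 = Tuple[float, float, float]

HOVER_DISTANCE_M = 0.10  # assumed: hand within 10 cm of the mouse counts as hovering
SELECT_RADIUS_M = 0.05   # assumed: cursor must be within 5 cm of an object to select it


def dist(a: Vec3, b: Vec3) -> float:
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5


@dataclass
class SceneObject:
    name: str
    position: Vec3


@dataclass
class HandPose:
    palm_position: Vec3
    is_pinching: bool


@dataclass
class HandOverController:
    mouse_position: Vec3                 # physical mouse location on the desk
    objects: List[SceneObject]
    cursor: Vec3 = (0.0, 0.0, 0.0)
    mode: str = "MOUSE_SELECT"           # or "HAND_MANIPULATE"
    selected: Optional[SceneObject] = None

    def update(self, mouse_dx: float, mouse_dy: float, scroll: float,
               hand: Optional[HandPose]) -> None:
        if self.mode == "MOUSE_SELECT":
            # Mouse x/y moves the cursor in the view plane; scroll adjusts depth,
            # giving a depth-aware 3D cursor for precise, restful targeting.
            x, y, z = self.cursor
            self.cursor = (x + mouse_dx, y + mouse_dy, z + scroll)
            self.selected = min(self.objects,
                                key=lambda o: dist(o.position, self.cursor),
                                default=None)
            if self.selected and dist(self.selected.position, self.cursor) > SELECT_RADIUS_M:
                self.selected = None

            # Hovering the hand over the mouse triggers the hand-over to manipulation.
            if (self.selected and hand is not None
                    and dist(hand.palm_position, self.mouse_position) < HOVER_DISTANCE_M):
                self.mode = "HAND_MANIPULATE"
        else:
            # Direct manipulation: the selected object follows the tracked hand
            # until the pinch is released, returning control to mouse targeting.
            if hand is None or not hand.is_pinching:
                self.mode = "MOUSE_SELECT"
            else:
                self.selected.position = hand.palm_position
```

In this sketch the transition condition is a simple distance check between the tracked palm and the mouse; the paper's study compares this style of hand-off against Ray and Ray+Hand baselines, so the threshold values here are placeholders only.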
Similar Papers
Multimodal Human-Intent Modeling for Contextual Robot-to-Human Handovers of Arbitrary Objects
Robotics
Robots learn to hand you things you want.
A Virtual Mechanical Interaction Layer Enables Resilient Human-to-Robot Object Handovers
Robotics
Robot learns to catch objects from people better.
Collaborative Object Handover in a Robot Crafting Assistant
Robotics
Robots learn to hand things to people smoothly.