Grasp Prediction based on Local Finger Motion Dynamics
By: Dimitar Valkov, Pascal Kockwelp, Florian Daiber, and more
Potential Business Impact:
Predicts what you'll grab before you touch it.
The ability to predict the object the user intends to grasp offers essential contextual information and may help to mitigate the effects of point-to-point latency in interactive environments. This paper explores the feasibility and accuracy of real-time recognition of uninstrumented objects based on hand kinematics during reach-to-grasp actions. In a data collection study, we recorded the hand motions of 16 participants while reaching out to grasp and then moving real and synthetic objects. Our results demonstrate that even a simple LSTM network can predict the time point at which the user grasps an object with a precision better than 21 ms and the current distance to this object with a precision better than 1 cm. The target's size can be determined in advance with an accuracy better than 97%. Our results have implications for designing adaptive and fine-grained interactive user interfaces in ubiquitous and mixed-reality environments.
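To make the setup concrete, here is a minimal sketch of how such a multi-head LSTM over per-frame hand-kinematics features could look, written in PyTorch. This is not the authors' implementation: the feature dimension (63, e.g. 21 hand joints x 3 coordinates), the hidden size, the three size classes, and all names are illustrative assumptions; only the idea of a recurrent encoder with time-to-grasp, distance, and size heads follows the abstract.

import torch
import torch.nn as nn

class GraspPredictor(nn.Module):
    """Hypothetical multi-head LSTM for reach-to-grasp prediction."""
    def __init__(self, n_features=63, hidden_size=128, n_size_classes=3):
        super().__init__()
        # Sequence encoder over hand-motion frames (e.g. joint positions).
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        # Regression head: time remaining until the grasp occurs (seconds).
        self.time_head = nn.Linear(hidden_size, 1)
        # Regression head: current hand-to-target distance (metres).
        self.dist_head = nn.Linear(hidden_size, 1)
        # Classification head: target size category.
        self.size_head = nn.Linear(hidden_size, n_size_classes)

    def forward(self, x):
        # x: (batch, time, n_features) stream of kinematic frames.
        out, _ = self.lstm(x)
        h = out[:, -1]  # last hidden state summarizes the motion so far
        return (self.time_head(h).squeeze(-1),
                self.dist_head(h).squeeze(-1),
                self.size_head(h))

# Usage: a batch of 4 windows, each 90 frames of 63-dim hand features.
model = GraspPredictor()
t_to_grasp, distance, size_logits = model(torch.randn(4, 90, 63))
size_class = size_logits.argmax(dim=-1)

Sharing one recurrent encoder across the three outputs mirrors the paper's framing that a single, simple sequence model suffices for all three predictions; in practice the regression heads would be trained with an L1/L2 loss and the size head with cross-entropy.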
Similar Papers
Predicting User Grasp Intentions in Virtual Reality
Human-Computer Interaction
Makes VR hands move like real hands.
Gaze-Guided 3D Hand Motion Prediction for Detecting Intent in Egocentric Grasping Tasks
CV and Pattern Recognition
Helps robots guess where your hand will move next.
Grasp-HGN: Grasping the Unexpected
Robotics
Helps robotic hands grab things they've never seen.