Towards Context-Aware Human-like Pointing Gestures with RL Motion Imitation
By: Anna Deichler, Siyang Wang, Simon Alexanderson, and more
Potential Business Impact:
Robots learn to point like humans.
Pointing is a key mode of communication in human-robot interaction, yet most prior work has focused on recognizing pointing gestures rather than generating them. We present a motion capture dataset of human pointing gestures covering diverse styles, handedness, and spatial targets. Using reinforcement learning with motion imitation, we train control policies that reproduce human-like pointing motion while maximizing pointing precision. Results show that our approach produces context-aware pointing behaviors in simulation, balancing task performance with natural dynamics.
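The abstract describes an objective that trades off human-likeness against task precision. As a rough illustration of what such a combined reward could look like, here is a minimal sketch assuming a DeepMimic-style pose-tracking term plus an angular pointing-error term. The weights (W_IMITATION, W_TASK) and scales (sigma, sigma_ang) are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical trade-off weights; the paper does not specify these values.
W_IMITATION = 0.7
W_TASK = 0.3

def imitation_reward(joint_pos, ref_joint_pos, sigma=0.1):
    """DeepMimic-style tracking term: reward closeness to the mocap reference pose."""
    err = np.sum((joint_pos - ref_joint_pos) ** 2)
    return np.exp(-err / (2.0 * sigma ** 2))

def pointing_reward(wrist_pos, fingertip_pos, target_pos, sigma_ang=0.2):
    """Precision term: penalize the angle between the pointing ray and the target ray."""
    ray = fingertip_pos - wrist_pos
    to_target = target_pos - wrist_pos
    cos_ang = np.dot(ray, to_target) / (
        np.linalg.norm(ray) * np.linalg.norm(to_target) + 1e-8
    )
    angle = np.arccos(np.clip(cos_ang, -1.0, 1.0))
    return np.exp(-(angle ** 2) / (2.0 * sigma_ang ** 2))

def total_reward(joint_pos, ref_joint_pos, wrist_pos, fingertip_pos, target_pos):
    """Weighted sum balancing human-likeness against pointing precision."""
    return (W_IMITATION * imitation_reward(joint_pos, ref_joint_pos)
            + W_TASK * pointing_reward(wrist_pos, fingertip_pos, target_pos))

# Toy example: a pose close to the reference, pointing roughly at the target.
r = total_reward(
    joint_pos=np.zeros(10),
    ref_joint_pos=np.full(10, 0.05),
    wrist_pos=np.array([0.0, 1.0, 0.0]),
    fingertip_pos=np.array([0.2, 1.0, 0.1]),
    target_pos=np.array([2.0, 1.0, 1.0]),
)
print(f"combined reward: {r:.3f}")
```

Exponentiating both error terms keeps each reward in [0, 1], so the weights can be read directly as a style-versus-precision trade-off; how the actual policy objective is structured is left to the paper itself.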
Similar Papers
Learning to Generate Pointing Gestures in Situated Embodied Conversational Agents
Robotics
Robots learn to point and talk naturally.
Deep Sensorimotor Control by Imitating Predictive Models of Human Motion
Robotics
Robots learn to move by watching humans.
Examining the legibility of humanoid robot arm movements in a pointing task
Robotics
Helps robots show where they're going.