Examining the legibility of humanoid robot arm movements in a pointing task
By: Andrej Lúčny, Matilde Antonj, Carlo Mazzola, and more
Potential Business Impact:
Helps robots show where they're going.
Human-robot interaction requires robots whose actions are legible, allowing humans to interpret, predict, and feel safe around them. This study investigates the legibility of humanoid robot arm movements in a pointing task, aiming to understand how humans predict robot intentions from truncated movements and bodily cues. We designed an experiment using the NICO humanoid robot, where participants observed its arm movements towards targets on a touchscreen. Robot cues varied across conditions: gaze, pointing, and pointing with congruent or incongruent gaze. Arm trajectories were stopped at 60% or 80% of their full length, and participants predicted the final target. We tested the multimodal superiority and ocular primacy hypotheses, both of which were supported by the experiment.
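The central manipulation is truncating the arm trajectory at 60% or 80% of its full path length before participants guess the target. As an illustration only (not the authors' code), the Python sketch below shows one way such a truncation could be computed from a recorded end-effector path; the function name, array layout, and example path are assumptions.

import numpy as np

def truncate_trajectory(trajectory: np.ndarray, fraction: float) -> np.ndarray:
    # trajectory: (N, D) array of end-effector positions sampled over time (hypothetical format).
    # Compute cumulative arc length along the sampled path.
    step_lengths = np.linalg.norm(np.diff(trajectory, axis=0), axis=1)
    cumulative = np.concatenate([[0.0], np.cumsum(step_lengths)])
    total_length = cumulative[-1]
    # Keep every sample up to the point where the travelled distance
    # reaches the requested fraction of the full path length.
    cutoff = fraction * total_length
    return trajectory[cumulative <= cutoff]

# Example: a straight reach toward a target, cut at the study's 60% and 80% marks.
full_path = np.linspace([0.0, 0.0, 0.0], [0.3, 0.2, 0.1], num=50)
for frac in (0.6, 0.8):
    partial = truncate_trajectory(full_path, frac)
    print(f"{int(frac * 100)}% cut: {len(partial)} of {len(full_path)} samples shown")

Truncating by travelled path length rather than by elapsed time is one plausible reading of "stopped at 60% or 80% of their full length"; a time-based cut would differ only in how the cutoff index is chosen.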
Similar Papers
Unconscious and Intentional Human Motion Cues for Expressive Robot-Arm Motion Design
Robotics
Robots move like people to show feelings.
Signaling Human Intentions to Service Robots: Understanding the Use of Social Cues during In-Person Conversations
Human-Computer Interaction
Robots understand when you want to talk.
Learning to Generate Pointing Gestures in Situated Embodied Conversational Agents
Robotics
Robots learn to point and talk naturally.