A Virtual Mechanical Interaction Layer Enables Resilient Human-to-Robot Object Handovers
By: Omar Faris, Sławomir Tadeja, Fulvio Forni
Potential Business Impact:
Robots learn to receive objects handed to them by people more reliably.
Object handover is a form of interaction widely present in collaborative tasks, yet achieving it efficiently remains a challenge. We address the problem of ensuring resilient robotic actions that can adapt to complex changes in object pose during human-to-robot object handovers. We propose the use of Virtual Model Control to create an interaction layer that controls the robot and adapts to the dynamic changes of the handover process. Additionally, we propose the use of augmented reality to facilitate bidirectional communication between humans and robots during handovers. We assess the performance of our controller in a set of experiments that demonstrate its resilience to various sources of uncertainty, including complex changes to the object's pose during the handover. Finally, we conduct a user study with 16 participants to understand human preferences for different robot control profiles and augmented reality visuals in object handovers. Our results show a general preference for the proposed approach and reveal insights that can guide further development in adapting the interaction to the user.
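The abstract does not spell out the controller, but the general idea behind Virtual Model Control is to attach virtual mechanical elements (for example, a spring and damper) between the robot and a target and map the resulting virtual force to joint torques through the Jacobian transpose. The minimal Python sketch below is purely illustrative of that general technique, not the paper's implementation: the function name vmc_torques, the gains k_p and k_d, and the toy Jacobian are all assumptions.

```python
import numpy as np

def vmc_torques(x_ee, v_ee, x_obj, J, k_p=200.0, k_d=25.0):
    """Joint torques from a virtual spring-damper attached between
    the end-effector and the tracked object position (illustrative)."""
    # Virtual spring pulls the gripper toward the object; the virtual
    # damper dissipates end-effector velocity to keep motion stable.
    f_virtual = k_p * (x_obj - x_ee) - k_d * v_ee
    # Map the Cartesian virtual force to joint torques: tau = J^T f.
    return J.T @ f_virtual

# Toy example (hypothetical numbers): a 3-joint arm in 3-D space.
J = np.array([[0.3, 0.1, 0.0],
              [0.0, 0.2, 0.1],
              [0.1, 0.0, 0.3]])           # hypothetical Jacobian
x_ee = np.array([0.4, 0.0, 0.5])          # gripper position (m)
v_ee = np.array([0.0, 0.0, 0.0])          # gripper velocity (m/s)
x_obj = np.array([0.5, 0.1, 0.6])         # tracked object position (m)
print(vmc_torques(x_ee, v_ee, x_obj, J))  # joint torques (N·m)
```

A layer of this kind suggests why such a controller can be resilient to pose changes: if the tracked object position x_obj moves mid-handover, the virtual spring continuously re-targets the gripper at the next control step, with no replanning required.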
Similar Papers
How Robot Kinematics Influence Human Performance in Virtual Robot-to-Human Handover Tasks
Robotics
Robots move smoothly to help people work together.
Multimodal Human-Intent Modeling for Contextual Robot-to-Human Handovers of Arbitrary Objects
Robotics
Robots learn to hand you things you want.
Modeling Dynamic Hand-Object Interactions with Applications to Human-Robot Handovers
Robotics
Robots learn to move and hand over objects the way people do.