Implicit Kinodynamic Motion Retargeting for Human-to-humanoid Imitation Learning
By: Xingyu Chen, Hanyu Wu, Sikai Wu, and more
Potential Business Impact:
Robots learn to copy human movements at scale.
Human-to-humanoid imitation learning aims to learn a humanoid whole-body controller from human motion. Motion retargeting is a crucial step that provides the reference trajectories robots need when exploring locomotion skills. However, current methods retarget motion frame by frame, which limits scalability. Could large-scale human motion be converted directly into robot-executable motion through a more efficient approach? To address this issue, we propose Implicit Kinodynamic Motion Retargeting (IKMR), a novel, efficient, and scalable retargeting framework that accounts for both kinematics and dynamics. On the kinematics side, IKMR pretrains a motion topology feature representation and a dual encoder-decoder architecture to learn a mapping between the human and robot motion domains. On the dynamics side, IKMR integrates imitation learning with the retargeting network to refine motions into physically feasible trajectories. After fine-tuning on the tracking results, IKMR achieves large-scale, physically feasible motion retargeting in real time, and a whole-body controller can be trained and deployed directly to track the retargeted trajectories. We conduct experiments both in simulation and on a full-size humanoid robot, and extensive evaluations verify the effectiveness of the proposed framework.
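To illustrate why a learned encoder-decoder mapping scales better than frame-by-frame optimization, here is a minimal sketch of the kinematic retargeting idea. All names, layer sizes, and the shared-latent design below are assumptions for illustration, not the paper's actual architecture or trained weights.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear(in_dim, out_dim):
    """Random weight matrix; stands in for trained network parameters."""
    return rng.standard_normal((in_dim, out_dim)) * 0.1

# Hypothetical dimensions: human pose features -> shared latent -> robot joints
HUMAN_DOF, LATENT, ROBOT_DOF = 69, 32, 29

W_enc_human = linear(HUMAN_DOF, LATENT)   # human-motion encoder
W_dec_robot = linear(LATENT, ROBOT_DOF)   # robot-motion decoder

def retarget(human_frames):
    """Map a batch of human pose frames to robot joint targets.

    Because the mapping is one batched matrix product rather than a
    per-frame optimization, an entire motion clip (or dataset) is
    converted in a single forward pass -- the scalability property
    the abstract emphasizes.
    """
    z = np.tanh(human_frames @ W_enc_human)  # shared latent motion features
    return z @ W_dec_robot                   # robot-executable joint trajectory

motion = rng.standard_normal((1000, HUMAN_DOF))  # a 1000-frame human clip
robot_traj = retarget(motion)
print(robot_traj.shape)  # (1000, 29)
```

In the full framework this kinematic mapping would additionally be refined by imitation learning so that the output trajectories respect the robot's dynamics; that dynamics stage is not sketched here.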
Similar Papers
Retargeting Matters: General Motion Retargeting for Humanoid Motion Tracking
Robotics
Makes robot movements look more like human movements.
A Whole-Body Motion Imitation Framework from Human Data for Full-Size Humanoid Robot
Robotics
Robots copy human moves, staying balanced.
OmniRetarget: Interaction-Preserving Data Generation for Humanoid Whole-Body Loco-Manipulation and Scene Interaction
Robotics
Robots learn parkour and object skills from human moves.