BeyondMimic: From Motion Tracking to Versatile Humanoid Control via Guided Diffusion
By: Qiayuan Liao, Takara E. Truong, Xiaoyu Huang, and more
Potential Business Impact:
Robots learn to copy human motions and perform human-like movements.
Learning skills from human motions offers a promising path toward generalizable policies for versatile humanoid whole-body control, yet two key cornerstones are missing: (1) a high-quality motion tracking framework that faithfully transforms large-scale kinematic references into robust and extremely dynamic motions on real hardware, and (2) a distillation approach that can effectively learn these motion primitives and compose them to solve downstream tasks. We address these gaps with BeyondMimic, a real-world framework to learn from human motions for versatile and naturalistic humanoid control via guided diffusion. Our framework provides a motion tracking pipeline capable of challenging skills such as jumping spins, sprinting, and cartwheels with state-of-the-art motion quality. Moving beyond simply mimicking existing motions, we further introduce a unified diffusion policy that enables zero-shot task-specific control at test time using simple cost functions. Deployed on hardware, BeyondMimic performs diverse tasks at test time, including waypoint navigation, joystick teleoperation, and obstacle avoidance, bridging sim-to-real motion tracking and flexible synthesis of human motion primitives for whole-body control. https://beyondmimic.github.io/.
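The abstract describes steering a single pretrained diffusion policy at test time with simple cost functions (guided diffusion). The sketch below illustrates only the general idea of gradient-based cost guidance during reverse-diffusion sampling; it is not the authors' implementation, and every name here (the `denoise_step` interface, `DummyDiffusionPolicy`, `waypoint_cost`, `guidance_scale`, the trajectory layout) is a hypothetical stand-in.

```python
# Minimal, illustrative sketch of test-time cost guidance for a diffusion
# policy. NOT the BeyondMimic implementation: the policy interface
# (denoise_step, horizon, action_dim), the trajectory layout (first two
# dims = base xy), and the waypoint cost are all hypothetical.
import torch


def waypoint_cost(traj: torch.Tensor, waypoint: torch.Tensor) -> torch.Tensor:
    """Quadratic cost: distance of the trajectory's final base xy position
    (assumed layout traj[..., :2]) to a target waypoint."""
    return ((traj[:, -1, :2] - waypoint) ** 2).sum(dim=-1).mean()


class DummyDiffusionPolicy:
    """Stand-in for a trained diffusion policy (hypothetical interface)."""
    horizon, action_dim = 16, 29

    def denoise_step(self, traj, obs, t):
        # A real policy would predict the posterior mean with its network;
        # here we return the input unchanged with a fixed noise scale.
        return traj, torch.full_like(traj, 0.1)


@torch.no_grad()
def guided_sample(policy, obs, waypoint, num_steps=50, guidance_scale=1.0):
    """Reverse-diffusion sampling with gradient-based cost guidance.

    At every denoising step, the mean predicted by the pretrained policy is
    nudged down the gradient of the task cost, steering the sampled action
    trajectory toward the waypoint without any retraining.
    """
    traj = torch.randn(1, policy.horizon, policy.action_dim)  # start from noise
    for t in reversed(range(num_steps)):
        mean, std = policy.denoise_step(traj, obs, t)

        # Cost guidance: gradient of the task cost w.r.t. the current sample.
        with torch.enable_grad():
            traj_g = traj.detach().requires_grad_(True)
            grad = torch.autograd.grad(waypoint_cost(traj_g, waypoint), traj_g)[0]

        mean = mean - guidance_scale * (std ** 2) * grad  # steer the mean
        traj = mean + std * torch.randn_like(mean) if t > 0 else mean
    return traj


if __name__ == "__main__":
    policy = DummyDiffusionPolicy()
    traj = guided_sample(policy, obs=None, waypoint=torch.tensor([2.0, 0.0]))
    print(traj.shape)  # (1, 16, 29)
```

Swapping in a different cost function under the same sampling loop is how, per the abstract, the one policy can also cover tasks such as joystick teleoperation and obstacle avoidance at test time.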
Similar Papers
Towards Adaptable Humanoid Control via Adaptive Motion Tracking
Robotics
Robots copy human moves from one example.