From Generated Human Videos to Physically Plausible Robot Trajectories
By: James Ni, Zekai Wang, Wei Lin, and more
Potential Business Impact:
Robots learn to copy human movements from AI-generated videos.
Video generation models are rapidly improving in their ability to synthesize human actions in novel contexts, holding the potential to serve as high-level planners for contextual robot control. To realize this potential, a key research question remains open: how can a humanoid execute the human actions from generated videos in a zero-shot manner? This challenge arises because generated videos are often noisy and exhibit morphological distortions that make direct imitation difficult compared to real videos. To address this, we introduce a two-stage pipeline. First, we lift video pixels into a 4D human representation and then retarget it to the humanoid morphology. Second, we propose GenMimic, a physics-aware reinforcement learning policy conditioned on 3D keypoints and trained with symmetry regularization and keypoint-weighted tracking rewards. As a result, GenMimic can mimic human actions from noisy, generated videos. We curate GenMimicBench, a synthetic human-motion dataset generated using two video generation models across a spectrum of actions and contexts, establishing a benchmark for assessing zero-shot generalization and policy robustness. Extensive experiments demonstrate improvements over strong baselines in simulation and confirm coherent, physically stable motion tracking on a Unitree G1 humanoid robot without fine-tuning. This work offers a promising path to realizing the potential of video generation models as high-level policies for robot control.
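The abstract does not give the exact form of the keypoint-weighted tracking reward. A minimal sketch of one plausible form, an exponential kernel over per-keypoint position errors with importance weights (the function name, the `weights` scheme, and the `sigma` bandwidth are all assumptions, not taken from the paper), might look like:

```python
import numpy as np

def keypoint_tracking_reward(robot_kp, ref_kp, weights, sigma=0.5):
    """Hypothetical keypoint-weighted tracking reward.

    robot_kp, ref_kp : (K, 3) arrays of 3D keypoint positions for the
        humanoid and the retargeted reference motion, respectively.
    weights : (K,) per-keypoint importance weights (e.g. emphasizing
        hands and feet over torso keypoints), assumed to sum to 1.
    sigma : bandwidth controlling how sharply reward decays with error.
    """
    # Per-keypoint Euclidean tracking error.
    errors = np.linalg.norm(robot_kp - ref_kp, axis=1)
    # Weighted sum of squared errors; important keypoints dominate.
    weighted_err = np.sum(weights * errors ** 2)
    # Reward in (0, 1], equal to 1 when tracking is perfect.
    return np.exp(-weighted_err / sigma ** 2)
```

Weighting the keypoints lets the policy tolerate noise in less critical joints of the generated video while still tracking end-effectors closely, which is one way a tracking reward can be made robust to the distortions the abstract describes.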
Similar Papers
X-Humanoid: Robotize Human Videos to Generate Humanoid Videos at Scale
CV and Pattern Recognition
Turns human videos into robot training videos.
VidBot: Learning Generalizable 3D Actions from In-the-Wild 2D Human Videos for Zero-Shot Robotic Manipulation
Robotics
Teaches robots to do tasks from watching videos.
Opening the Sim-to-Real Door for Humanoid Pixel-to-Action Policy Transfer
Robotics
Robots learn to open doors just by watching.