Unveiling the Impact of Data and Model Scaling on High-Level Control for Humanoid Robots
By: Yuxi Wei, Zirui Wang, Kangning Yin, and more
Potential Business Impact:
Teaches robots to move like humans from videos.
Data scaling has long remained a critical bottleneck in robot learning. For humanoid robots, human videos and motion data are abundant and widely available, offering a free, large-scale data source. Moreover, the semantics associated with these motions enable modality alignment and high-level robot control learning. However, how to effectively mine raw video, extract robot-learnable representations, and leverage them for scalable learning remains an open problem. To address this, we introduce Humanoid-Union, a large-scale dataset generated through an autonomous pipeline, comprising over 260 hours of diverse, high-quality humanoid robot motion data with semantic annotations derived from human motion videos. The dataset can be further expanded via the same pipeline. Building on this data resource, we propose SCHUR, a scalable learning framework designed to explore the impact of large-scale data on high-level control in humanoid robots. Experimental results demonstrate that SCHUR achieves high robot motion generation quality and strong text-motion alignment under data and model scaling, with a 37% reconstruction improvement in MPJPE and a 25% alignment improvement in FID compared with previous methods. Its effectiveness is further validated through deployment on a real-world humanoid robot.
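The abstract reports reconstruction quality in MPJPE (mean per-joint position error), a standard metric for 3D motion. As a point of reference, here is a minimal sketch of how MPJPE is typically computed; the array shapes, joint count, and function name are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

def mpjpe(pred_joints: np.ndarray, gt_joints: np.ndarray) -> float:
    """Mean Per-Joint Position Error, in the same units as the inputs.

    pred_joints, gt_joints: arrays of shape (T, J, 3) with predicted and
    ground-truth 3D joint positions over T frames and J joints.
    (Shapes are an assumption for illustration, not the paper's setup.)
    """
    assert pred_joints.shape == gt_joints.shape
    # Euclidean distance per joint per frame, averaged over all joints/frames.
    per_joint_error = np.linalg.norm(pred_joints - gt_joints, axis=-1)
    return float(per_joint_error.mean())

# Example usage with random data: 120 frames of a hypothetical 23-joint skeleton.
pred = np.random.rand(120, 23, 3)
gt = np.random.rand(120, 23, 3)
print(f"MPJPE: {mpjpe(pred, gt):.4f}")
```

A lower MPJPE means the generated motion's joint trajectories stay closer to the reference motion; the 37% figure above refers to a relative reduction of this error versus prior methods.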
Similar Papers
SONIC: Supersizing Motion Tracking for Natural Humanoid Whole-Body Control
Robotics
Makes robots move like humans naturally.
X-Humanoid: Robotize Human Videos to Generate Humanoid Videos at Scale
CV and Pattern Recognition
Turns human videos into robot training videos.
HumanoidExo: Scalable Whole-Body Humanoid Manipulation via Wearable Exoskeleton
Robotics
Teaches robots to move like humans faster.