Score: 2

MiMo-Embodied: X-Embodied Foundation Model Technical Report

Published: November 20, 2025 | arXiv ID: 2511.16518v1

By: Xiaoshuai Hao, Lei Zhou, Zhijian Huang, and more

Potential Business Impact:
A single model transfers skills between autonomous vehicles and robots, so training in one domain improves performance in the other.

Business Areas:
Robotics Hardware, Science and Engineering, Software

We open-source MiMo-Embodied, the first cross-embodied foundation model to successfully integrate and achieve state-of-the-art performance in both Autonomous Driving and Embodied AI. MiMo-Embodied sets new records across 17 embodied AI benchmarks in Task Planning, Affordance Prediction, and Spatial Understanding, while also excelling in 12 autonomous driving benchmarks across Environmental Perception, Status Prediction, and Driving Planning. Across these tasks, MiMo-Embodied significantly outperforms existing open-source, closed-source, and specialized baselines. Our results indicate that through multi-stage learning, curated data construction, and chain-of-thought (CoT) and reinforcement-learning (RL) fine-tuning, these two domains exhibit strong positive transfer and mutually reinforce one another. We provide a detailed analysis of our model design and training methodologies to facilitate further research. Code and models are available at https://github.com/XiaomiMiMo/MiMo-Embodied.
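Since the abstract points to released code and models, below is a minimal usage sketch. It assumes the checkpoint is distributed as a Hugging Face-compatible vision-language model; the repo id, prompt, and image path are illustrative assumptions, not details confirmed by this report.

```python
# Minimal sketch: querying a cross-embodied vision-language checkpoint.
# Assumptions (not confirmed above): the weights load via Hugging Face
# transformers and the repo id "XiaomiMiMo/MiMo-Embodied" is hypothetical.
from transformers import AutoProcessor, AutoModelForCausalLM
from PIL import Image

model_id = "XiaomiMiMo/MiMo-Embodied"  # hypothetical Hugging Face repo id
processor = AutoProcessor.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, trust_remote_code=True, torch_dtype="auto"
)

image = Image.open("kitchen_scene.jpg")  # placeholder input image
prompt = "Plan the steps to place the mug into the dishwasher."  # task-planning query

inputs = processor(text=prompt, images=image, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=128)
print(processor.batch_decode(output_ids, skip_special_tokens=True)[0])
```

The same call pattern would apply to a driving-scene image with a perception or planning prompt, which is the cross-embodiment transfer the abstract emphasizes.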

Repos / Data Links
https://github.com/XiaomiMiMo/MiMo-Embodied

Page Count
68 pages

Category
Computer Science: Robotics