Score: 2

MCGS-SLAM: A Multi-Camera SLAM Framework Using Gaussian Splatting for High-Fidelity Mapping

Published: September 17, 2025 | arXiv ID: 2509.14191v1

By: Zhihao Cao, Hanyu Wu, Li Wa Tang, and more

BigTech Affiliations: Microsoft

Potential Business Impact:

Enables multi-camera rigs to map environments more completely and accurately, including side-view regions that single-camera setups miss.

Business Areas:
GPS Hardware, Navigation and Mapping

Recent progress in dense SLAM has primarily targeted monocular setups, often at the expense of robustness and geometric coverage. We present MCGS-SLAM, the first purely RGB-based multi-camera SLAM system built on 3D Gaussian Splatting (3DGS). Unlike prior methods relying on sparse maps or inertial data, MCGS-SLAM fuses dense RGB inputs from multiple viewpoints into a unified, continuously optimized Gaussian map. A multi-camera bundle adjustment (MCBA) jointly refines poses and depths via dense photometric and geometric residuals, while a scale consistency module enforces metric alignment across views using low-rank priors. The system supports RGB input and maintains real-time performance at large scale. Experiments on synthetic and real-world datasets show that MCGS-SLAM consistently yields accurate trajectories and photorealistic reconstructions, usually outperforming monocular baselines. Notably, the wide field of view from multi-camera input enables reconstruction of side-view regions that monocular setups miss, critical for safe autonomous operation. These results highlight the promise of multi-camera Gaussian Splatting SLAM for high-fidelity mapping in robotics and autonomous driving.
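
The abstract describes a multi-camera bundle adjustment (MCBA) that jointly refines poses and depths through dense photometric and geometric residuals summed over all cameras. The paper's code is not reproduced here, so the snippet below is only a minimal illustrative sketch of what such a joint objective could look like; the function names (mcba_objective, photometric_residual, geometric_residual) and the weight lambda_geo are assumptions for illustration, not the authors' implementation.

import numpy as np

# Illustrative sketch (not the paper's code): combine dense photometric and
# geometric residuals from several cameras into one bundle-adjustment-style cost.

def photometric_residual(rendered_rgb, observed_rgb):
    # Per-pixel color difference between the rendered Gaussian map and the camera image.
    return (rendered_rgb - observed_rgb).reshape(-1)

def geometric_residual(rendered_depth, estimated_depth):
    # Per-pixel depth difference between the rendered map and the estimated depth.
    return (rendered_depth - estimated_depth).reshape(-1)

def mcba_objective(renders, observations, lambda_geo=0.5):
    # Sum of squared photometric and (weighted) geometric residuals over all cameras.
    # `renders` and `observations` are lists of (rgb, depth) array pairs, one per camera.
    total = 0.0
    for (r_rgb, r_depth), (o_rgb, o_depth) in zip(renders, observations):
        r_pho = photometric_residual(r_rgb, o_rgb)
        r_geo = geometric_residual(r_depth, o_depth)
        total += np.sum(r_pho ** 2) + lambda_geo * np.sum(r_geo ** 2)
    return total

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    h, w = 4, 4  # tiny images, purely for illustration
    renders = [(rng.random((h, w, 3)), rng.random((h, w))) for _ in range(3)]
    observations = [(rng.random((h, w, 3)), rng.random((h, w))) for _ in range(3)]
    print("joint residual cost:", mcba_objective(renders, observations))

In an actual system this scalar cost would be minimized with respect to camera poses, depths, and the Gaussian map parameters; the sketch only shows how residuals from multiple viewpoints might be fused into a single objective.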

Country of Origin
πŸ‡ΊπŸ‡Έ πŸ‡©πŸ‡ͺ πŸ‡³πŸ‡± πŸ‡¨πŸ‡­ United States, Germany, Netherlands, Switzerland

Page Count
9 pages

Category
Computer Science:
Robotics