MCGS-SLAM: A Multi-Camera SLAM Framework Using Gaussian Splatting for High-Fidelity Mapping
By: Zhihao Cao, Hanyu Wu, Li Wa Tang, and others
Potential Business Impact:
Lets multi-camera rigs map places more completely, including side views that single cameras miss.
Recent progress in dense SLAM has primarily targeted monocular setups, often at the expense of robustness and geometric coverage. We present MCGS-SLAM, the first purely RGB-based multi-camera SLAM system built on 3D Gaussian Splatting (3DGS). Unlike prior methods that rely on sparse maps or inertial data, MCGS-SLAM fuses dense RGB inputs from multiple viewpoints into a unified, continuously optimized Gaussian map. A multi-camera bundle adjustment (MCBA) jointly refines poses and depths via dense photometric and geometric residuals, while a scale consistency module enforces metric alignment across views using low-rank priors. The system supports RGB-only input and maintains real-time performance at large scale. Experiments on synthetic and real-world datasets show that MCGS-SLAM consistently yields accurate trajectories and photorealistic reconstructions, usually outperforming monocular baselines. Notably, the wide field of view from multi-camera input enables reconstruction of side-view regions that monocular setups miss, a capability critical for safe autonomous operation. These results highlight the promise of multi-camera Gaussian Splatting SLAM for high-fidelity mapping in robotics and autonomous driving.
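To make the abstract's two optimization ingredients concrete, here is a minimal Python/NumPy sketch of (a) a dense photometric residual that warps pixels from one camera into another via estimated depth, and (b) a toy stand-in for a cross-view scale-consistency term. This is not the authors' implementation; the function names, the pinhole intrinsics K, the relative pose T_ab, and the median-depth consensus are illustrative assumptions, and the sketch omits the Gaussian map, the geometric residuals, and any solver.

```python
# Illustrative sketch only (not MCGS-SLAM's code): dense photometric
# residual between two cameras, plus a toy scale-consistency residual.
import numpy as np

def project(K, T, points_cam):
    """Apply rigid transform T (4x4), then pinhole intrinsics K (3x3)."""
    pts_h = np.concatenate([points_cam, np.ones((points_cam.shape[0], 1))], axis=1)
    pts_b = (T @ pts_h.T).T[:, :3]              # points in target camera frame
    uv = (K @ pts_b.T).T
    return uv[:, :2] / uv[:, 2:3], pts_b[:, 2]  # pixel coords, depths

def backproject(K, depth, uv):
    """Lift pixels uv (N,2) with depths (N,) into camera-frame 3D points."""
    ones = np.ones((uv.shape[0], 1))
    rays = (np.linalg.inv(K) @ np.concatenate([uv, ones], axis=1).T).T
    return rays * depth[:, None]

def photometric_residual(img_a, img_b, K, T_ab, depth_a, uv_a):
    """Dense photometric term: intensity at integer pixels uv_a in camera A
    minus the intensity sampled at the warped location in camera B
    (nearest-neighbor sampling for simplicity)."""
    pts_a = backproject(K, depth_a, uv_a)
    uv_b, z_b = project(K, T_ab, pts_a)
    h, w = img_b.shape
    res = np.zeros(uv_a.shape[0])
    for i, ((ua, va), (ub, vb), z) in enumerate(zip(uv_a, uv_b, z_b)):
        if z <= 0 or not (0 <= ub < w and 0 <= vb < h):
            continue  # warped point falls behind or outside camera B; skip
        res[i] = img_a[int(va), int(ua)] - img_b[int(round(vb)), int(round(ub))]
    return res

def scale_consistency_residual(depths_per_view, scales):
    """Toy stand-in for the low-rank scale prior: penalize deviation of each
    view's scaled median depth from the shared consensus across the rig."""
    med = np.array([np.median(d) for d in depths_per_view]) * np.asarray(scales)
    return med - med.mean()
```

In the full system, residuals like these would be stacked over all camera pairs and minimized jointly over camera poses, per-pixel depths, and Gaussian parameters inside the MCBA; the sketch only shows the shape of the per-pair terms.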
Similar Papers
EGS-SLAM: RGB-D Gaussian Splatting SLAM with Events
Robotics
Makes 3D pictures clear even when moving fast.
Unposed 3DGS Reconstruction with Probabilistic Procrustes Mapping
CV and Pattern Recognition
Creates detailed 3D worlds from many photos.
A Survey on Collaborative SLAM with 3D Gaussian Splatting
Robotics
Robots map places together faster and better.