VIGS-SLAM: Visual Inertial Gaussian Splatting SLAM
By: Zihan Zhu, Wei Zhang, Norbert Haala, and more
Potential Business Impact:
Helps robots track and map reliably despite motion blur, poor texture, and changing lighting.
We present VIGS-SLAM, a visual-inertial 3D Gaussian Splatting SLAM system that achieves robust real-time tracking and high-fidelity reconstruction. Although recent 3DGS-based SLAM methods achieve dense and photorealistic mapping, their purely visual design degrades under motion blur, low texture, and exposure variations. Our method tightly couples visual and inertial cues within a unified optimization framework, jointly refining camera poses, depths, and IMU states. It features robust IMU initialization, time-varying bias modeling, and loop closure with consistent Gaussian updates. Experiments on four challenging datasets demonstrate our superiority over state-of-the-art methods. Project page: https://vigs-slam.github.io
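To illustrate the tightly coupled idea described in the abstract (visual and inertial residuals minimized jointly over poses and IMU states), here is a minimal, hypothetical sketch in Python. The toy landmarks, measurements, timestep, residual weights, and the simplified translation-plus-bias state are all assumptions made for illustration; they are not the authors' formulation, which additionally optimizes rotations, velocities, depths, and the Gaussian map.

```python
# Hypothetical toy sketch: jointly estimate two frame positions (p0, p1) and a
# constant accelerometer bias so that BOTH visual reprojection residuals and an
# IMU preintegration residual are satisfied. Values are illustrative only.
import numpy as np
from scipy.optimize import least_squares

# Three fixed 3D landmarks and their normalized-image observations in two frames.
landmarks = np.array([[1.0, 0.5, 4.0], [-0.5, 1.0, 5.0], [0.3, -0.8, 3.5]])
true_delta_p = np.array([0.8, 0.0, 0.0])                      # true motion frame 0 -> 1
obs0 = landmarks[:, :2] / landmarks[:, 2:3]                   # observations at frame 0
obs1 = (landmarks[:, :2] - true_delta_p[:2]) / landmarks[:, 2:3]

dt = 1.0
imu_accel = true_delta_p / (0.5 * dt**2) + 0.05               # "measured" accel with bias 0.05

def residuals(x):
    p0, p1, bias = x[0:3], x[3:6], x[6:9]
    # Visual term: reprojection error of the landmarks from each (translated) frame.
    r_vis0 = ((landmarks[:, :2] - p0[:2]) / (landmarks[:, 2:3] - p0[2]) - obs0).ravel()
    r_vis1 = ((landmarks[:, :2] - p1[:2]) / (landmarks[:, 2:3] - p1[2]) - obs1).ravel()
    # Inertial term: the preintegrated position change (bias-corrected) must match p1 - p0.
    delta_p_pred = 0.5 * (imu_accel - bias) * dt**2
    r_imu = (p1 - p0) - delta_p_pred
    # Weak prior keeping the bias small (stand-in for a bias random-walk prior).
    r_bias = 0.1 * bias
    return np.concatenate([r_vis0, r_vis1, r_imu, r_bias])

sol = least_squares(residuals, np.zeros(9))
print("estimated motion p1 - p0:", sol.x[3:6] - sol.x[0:3])   # close to [0.8, 0, 0]
print("estimated accel bias:", sol.x[6:9])                    # close to [0.05, 0.05, 0.05]
```

In this toy setting the inertial residual lets the optimizer recover the accelerometer bias alongside the motion, which is the kind of coupling a full visual-inertial system exploits; the real method replaces the toy reprojection term with rendering-based residuals from the Gaussian map and adds loop closure with consistent Gaussian updates.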
Similar Papers
GI-SLAM: Gaussian-Inertial SLAM
Robotics
Makes robots see better with motion and cameras.
GeVI-SLAM: Gravity-Enhanced Stereo Visual Inertial SLAM for Underwater Robots
Robotics
Helps underwater robots see and move precisely.
RoGER-SLAM: A Robust Gaussian Splatting SLAM System for Noisy and Low-light Environment Resilience
Robotics
Helps robots see and map in dark, noisy places.