GI-SLAM: Gaussian-Inertial SLAM
By: Xulang Liu, Ning Tan
Potential Business Impact:
Helps robots track where they are by combining camera images with motion-sensor data.
3D Gaussian Splatting (3DGS) has recently emerged as a powerful representation of geometry and appearance for dense Simultaneous Localization and Mapping (SLAM). Through rapid, differentiable rasterization of 3D Gaussians, many 3DGS SLAM methods achieve near real-time rendering and accelerated training. However, these methods largely overlook inertial data, which is a critical piece of information collected from the inertial measurement unit (IMU). In this paper, we present GI-SLAM, a novel Gaussian-inertial SLAM system which consists of an IMU-enhanced camera tracking module and a realistic 3D Gaussian-based scene representation for mapping. Our method introduces an IMU loss that seamlessly integrates into the deep learning framework underpinning 3D Gaussian Splatting SLAM, effectively enhancing the accuracy, robustness, and efficiency of camera tracking. Moreover, our SLAM system supports a wide range of sensor configurations, including monocular, stereo, and RGBD cameras, both with and without IMU integration. Our method achieves competitive performance compared with existing state-of-the-art real-time methods on the EuRoC and TUM-RGBD datasets.
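The abstract does not give the exact form of the IMU loss, but the general idea of such a term can be sketched as a residual between the translation estimated by the tracker and the translation predicted by integrating accelerometer samples between two frames. The following is a minimal illustrative sketch, not the paper's actual formulation: the function names, the naive double integration, and the assumption of gravity-compensated, world-frame accelerations are all simplifications introduced here.

```python
import numpy as np

def integrate_imu(v0, accels, dt):
    """Naively double-integrate accelerometer samples (assumed already
    gravity-compensated and expressed in the world frame) to predict
    the translation between two consecutive camera frames."""
    v = np.asarray(v0, dtype=float)
    p = np.zeros(3)
    for a in accels:
        v = v + np.asarray(a, dtype=float) * dt  # velocity update
        p = p + v * dt                           # position update
    return p

def imu_translation_loss(t_est, v0, accels, dt):
    """Squared L2 residual between the tracker's estimated inter-frame
    translation t_est and the IMU-predicted one. In a full system a
    term like this would be weighted and added to the photometric
    (and depth) rendering losses during tracking optimization."""
    t_imu = integrate_imu(v0, accels, dt)
    return float(np.sum((np.asarray(t_est, dtype=float) - t_imu) ** 2))
```

For example, with constant velocity and zero acceleration over ten 0.1 s steps, a tracker estimate that matches the integrated motion yields zero loss, while any drift in the estimate is penalized quadratically.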
Similar Papers
VIGS-SLAM: Visual Inertial Gaussian Splatting SLAM
Robotics
Makes robots see clearly in tricky places.
GS4: Generalizable Sparse Splatting Semantic SLAM
CV and Pattern Recognition
Builds detailed 3D maps from videos quickly.
EGS-SLAM: RGB-D Gaussian Splatting SLAM with Events
Robotics
Makes 3D pictures clear even when moving fast.