EGS-SLAM: RGB-D Gaussian Splatting SLAM with Events
By: Siyu Chen, Shenghai Yuan, Thien-Minh Nguyen, and more
Potential Business Impact:
Makes 3D pictures clear even when moving fast.
Plain English Summary
Imagine trying to take a clear picture or video while moving really fast: it often comes out blurry. This new method uses special "event" cameras that capture movement instantly, like a super-fast shutter, to create incredibly detailed 3D models even when things are moving quickly. This means we can build more accurate virtual worlds or digital twins of real places, even in challenging, fast-paced environments like a moving car or a busy factory.
Gaussian Splatting SLAM (GS-SLAM) offers a notable improvement over traditional SLAM methods, enabling photorealistic 3D reconstruction that conventional approaches often struggle to achieve. However, existing GS-SLAM systems perform poorly under persistent and severe motion blur commonly encountered in real-world scenarios, leading to significantly degraded tracking accuracy and compromised 3D reconstruction quality. To address this limitation, we propose EGS-SLAM, a novel GS-SLAM framework that fuses event data with RGB-D inputs to simultaneously reduce motion blur in images and compensate for the sparse and discrete nature of event streams, enabling robust tracking and high-fidelity 3D Gaussian Splatting reconstruction. Specifically, our system explicitly models the camera's continuous trajectory during exposure, supporting event- and blur-aware tracking and mapping on a unified 3D Gaussian Splatting scene. Furthermore, we introduce a learnable camera response function to align the dynamic ranges of events and images, along with a no-event loss to suppress ringing artifacts during reconstruction. We validate our approach on a new dataset comprising synthetic and real-world sequences with significant motion blur. Extensive experimental results demonstrate that EGS-SLAM consistently outperforms existing GS-SLAM systems in both trajectory accuracy and photorealistic 3D Gaussian Splatting reconstruction. The source code will be available at https://github.com/Chensiyu00/EGS-SLAM.
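The blur- and event-aware modeling described in the abstract rests on a standard relationship in event-camera literature: a motion-blurred frame is the average of the sharp latent frames over the exposure window, and events encode log-intensity changes that link those latent frames together. The sketch below is a minimal illustration of that general relationship, not the paper's actual implementation; the contrast threshold `C`, the toy signal, and the function names are all illustrative assumptions.

```python
import numpy as np

# Contrast threshold: an event fires when log intensity changes by C
# (illustrative value, not taken from the paper).
C = 0.2

def latents_from_events(L0, event_counts):
    """Latent intensity at each timestep via the event generation model:
    log L(t) = log L(0) + C * (signed event count up to t)."""
    return L0 * np.exp(C * np.cumsum(event_counts, axis=0))

def blurred_frame(latents):
    """Motion-blurred frame = average of the sharp latent frames
    captured during the exposure window."""
    return latents.mean(axis=0)

# Toy example: one pixel brightening during exposure fires positive events.
L0 = np.array([0.5])                     # sharp intensity at exposure start
events = np.array([[1], [1], [0], [1]])  # signed event counts per timestep
latents = latents_from_events(L0, events)
B = blurred_frame(latents)               # the blurry pixel a normal camera records
```

Given the blurred frame and the event stream, the same relationship can be inverted to recover sharp latent frames, which is the intuition behind using events to deblur images during tracking and mapping.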
Similar Papers
RoGER-SLAM: A Robust Gaussian Splatting SLAM System for Noisy and Low-light Environment Resilience
Robotics
Helps robots see and map in dark, noisy places.
FeatureSLAM: Feature-enriched 3D gaussian splatting SLAM in real time
CV and Pattern Recognition
Lets robots see and understand the world better.
Stereo 3D Gaussian Splatting SLAM for Outdoor Urban Scenes
Robotics
Maps outdoor places using only two cameras.