StreamSplat: Towards Online Dynamic 3D Reconstruction from Uncalibrated Video Streams
By: Zike Wu, Qi Yan, Xuanyu Yi, and more
Potential Business Impact:
Builds dynamic 3D scenes from ordinary, uncalibrated videos in real time.
Real-time reconstruction of dynamic 3D scenes from uncalibrated video streams is crucial for numerous real-world applications. However, existing methods struggle to jointly address three key challenges: 1) processing uncalibrated inputs in real time, 2) accurately modeling dynamic scene evolution, and 3) maintaining long-term stability and computational efficiency. To this end, we introduce StreamSplat, the first fully feed-forward framework that transforms uncalibrated video streams of arbitrary length into dynamic 3D Gaussian Splatting (3DGS) representations in an online manner, capable of recovering scene dynamics from temporally local observations. We propose two key technical innovations: a probabilistic sampling mechanism in the static encoder for 3DGS position prediction, and a bidirectional deformation field in the dynamic decoder that enables robust and efficient dynamic modeling. Extensive experiments on static and dynamic benchmarks demonstrate that StreamSplat consistently outperforms prior works in both reconstruction quality and dynamic scene modeling, while uniquely supporting online reconstruction of arbitrarily long video streams. Code and models are available at https://github.com/nickwzk/StreamSplat.
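The abstract names a bidirectional deformation field that lets the dynamic decoder model scene motion from temporally local observations. As a rough intuition only, a minimal sketch of the general idea follows: per-Gaussian means are warped forward (canonical space to the current frame) and backward (current frame to canonical space), with a cycle-consistency penalty tying the two directions together. All function names, shapes, and the linear toy field below are assumptions for illustration, not StreamSplat's actual architecture.

```python
import numpy as np

# Hypothetical sketch of a bidirectional deformation field for 3DGS means.
# A real implementation would use learned networks; here a shared linear
# map of (xyz, t) stands in for the field so the structure is visible.

def deform_forward(means, t, weights):
    """Toy forward deformation: canonical means -> frame-t means."""
    feats = np.concatenate([means, np.full((len(means), 1), t)], axis=1)  # (N, 4)
    return means + feats @ weights  # (N, 3)

def deform_backward(means_t, t, weights):
    """Toy backward deformation: frame-t means -> canonical space."""
    feats = np.concatenate([means_t, np.full((len(means_t), 1), t)], axis=1)
    return means_t - feats @ weights

def cycle_consistency_loss(means, t, w_fwd, w_bwd):
    """Penalize disagreement between the forward and backward directions."""
    warped = deform_forward(means, t, w_fwd)
    recovered = deform_backward(warped, t, w_bwd)
    return float(np.mean((recovered - means) ** 2))

rng = np.random.default_rng(0)
means = rng.normal(size=(128, 3))          # canonical Gaussian means
w = rng.normal(scale=0.01, size=(4, 3))    # toy field weights (assumed shared)
loss = cycle_consistency_loss(means, 0.5, w, w)
```

In this toy setup the loss is a scalar the decoder could minimize alongside rendering losses, encouraging the two deformation directions to invert each other.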
Similar Papers
StreamGS: Online Generalizable Gaussian Splatting Reconstruction for Unposed Image Streams
CV and Pattern Recognition
Builds 3D scenes from unposed image streams in real time.
ProDyG: Progressive Dynamic Scene Reconstruction via Gaussian Splatting from Monocular Videos
CV and Pattern Recognition
Builds 3D worlds from videos in real time.
Online 3D Gaussian Splatting Modeling with Novel View Selection
CV and Pattern Recognition
Creates more complete 3D models from fewer pictures.