7DGS: Unified Spatial-Temporal-Angular Gaussian Splatting
By: Zhongpai Gao, Benjamin Planche, Meng Zheng, and more
Potential Business Impact:
Makes moving 3D scenes look real, in real time.
Real-time rendering of dynamic scenes with view-dependent effects remains a fundamental challenge in computer graphics. While recent advances in Gaussian Splatting have shown promising results separately handling dynamic scenes (4DGS) and view-dependent effects (6DGS), no existing method unifies these capabilities while maintaining real-time performance. We present 7D Gaussian Splatting (7DGS), a unified framework representing scene elements as seven-dimensional Gaussians spanning position (3D), time (1D), and viewing direction (3D). Our key contribution is an efficient conditional slicing mechanism that transforms 7D Gaussians into view- and time-conditioned 3D Gaussians, maintaining compatibility with existing 3D Gaussian Splatting pipelines while enabling joint optimization. Experiments demonstrate that 7DGS outperforms prior methods by up to 7.36 dB in PSNR while achieving real-time rendering (401 FPS) on challenging dynamic scenes with complex view-dependent effects. The project page is: https://gaozhongpai.github.io/7dgs/.
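The "conditional slicing mechanism" in the abstract is, at its core, the standard conditioning of a multivariate Gaussian: fixing the time and viewing-direction dimensions of a 7D Gaussian yields a 3D Gaussian over position whose mean and covariance depend on the query. The sketch below illustrates that math with NumPy; the function name, the position/time/direction partitioning, and the Gaussian-falloff opacity scale are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def slice_7d_gaussian(mu, cov, t, view_dir):
    """Condition a 7D Gaussian (3D position, 1D time, 3D view direction) on a
    query time and viewing direction, producing a 3D Gaussian for splatting.

    This is textbook multivariate-Gaussian conditioning; the actual 7DGS
    parameterization and normalization may differ.
    """
    # Partition: first 3 dims = position (p), last 4 dims = time + direction (c).
    mu_p, mu_c = mu[:3], mu[3:]
    S_pp = cov[:3, :3]
    S_pc = cov[:3, 3:]
    S_cc = cov[3:, 3:]

    c = np.concatenate(([t], view_dir))     # conditioning vector (1 + 3 = 4D)
    S_cc_inv = np.linalg.inv(S_cc)
    delta = c - mu_c

    # Conditional (sliced) 3D mean and covariance.
    mu_cond = mu_p + S_pc @ S_cc_inv @ delta
    S_cond = S_pp - S_pc @ S_cc_inv @ S_pc.T

    # Opacity scale: how strongly this Gaussian responds to the query
    # time/direction (assumed Gaussian falloff; 7DGS may normalize differently).
    alpha_scale = np.exp(-0.5 * delta @ S_cc_inv @ delta)

    return mu_cond, S_cond, alpha_scale


# Toy usage: one 7D Gaussian, queried at t = 0.5 looking along +z.
rng = np.random.default_rng(0)
A = rng.standard_normal((7, 7))
cov = A @ A.T + 7 * np.eye(7)               # a valid positive-definite 7x7 covariance
mu = np.zeros(7)
mu3, cov3, alpha = slice_7d_gaussian(mu, cov, t=0.5, view_dir=np.array([0.0, 0.0, 1.0]))
```

In a full pipeline, the resulting 3D mean, covariance, and opacity scale would feed a standard 3D Gaussian Splatting rasterizer, which is how the abstract's claim of compatibility with existing 3DGS pipelines would be realized.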
Similar Papers
1000+ FPS 4D Gaussian Splatting for Dynamic Scene Rendering
CV and Pattern Recognition
Makes videos of moving things load super fast.
Geometry-Consistent 4D Gaussian Splatting for Sparse-Input Dynamic View Synthesis
CV and Pattern Recognition
Creates realistic 3D scenes from few pictures.
AAA-Gaussians: Anti-Aliased and Artifact-Free 3D Gaussian Rendering
Graphics
Makes 3D pictures look real and smooth.