Segmentation-Guided Neural Radiance Fields for Novel Street View Synthesis

Published: March 18, 2025 | arXiv ID: 2503.14219v1

By: Yizhou Li, Yusuke Monno, Masatoshi Okutomi, et al.

Potential Business Impact:

Creates realistic 3D views of outdoor scenes.

Business Areas:
Visual Search, Internet Services

Recent advances in Neural Radiance Fields (NeRF) have shown great potential in 3D reconstruction and novel view synthesis, particularly for indoor and small-scale scenes. However, extending NeRF to large-scale outdoor environments presents challenges such as transient objects, sparse cameras and textures, and varying lighting conditions. In this paper, we propose a segmentation-guided enhancement to NeRF for outdoor street scenes, focusing on complex urban environments. Our approach extends ZipNeRF and utilizes Grounded SAM for segmentation mask generation, enabling effective handling of transient objects, modeling of the sky, and regularization of the ground. We also introduce appearance embeddings to adapt to inconsistent lighting across view sequences. Experimental results demonstrate that our method outperforms the baseline ZipNeRF, improving novel view synthesis quality with fewer artifacts and sharper details.
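To make the abstract's pipeline concrete, here is a minimal PyTorch sketch of how segmentation masks could gate a NeRF-style training loss and how per-sequence appearance embeddings could be stored. This is an illustrative assumption, not the authors' implementation: the function names, shapes, and the idea that Grounded SAM masks arrive as per-ray booleans are all hypothetical.

```python
# Minimal sketch (not the paper's code) of segmentation-guided NeRF losses.
# Assumptions: per-ray predicted/ground-truth colors, and boolean masks
# (transient objects, sky) derived offline from Grounded SAM segmentations.
import torch
import torch.nn as nn

NUM_SEQUENCES = 8   # assumed number of capture sequences with distinct lighting
EMBED_DIM = 32      # assumed appearance-embedding size

# Per-sequence appearance embeddings to absorb inconsistent lighting;
# the looked-up code would condition the color head of the radiance field.
appearance = nn.Embedding(NUM_SEQUENCES, EMBED_DIM)

def masked_photometric_loss(pred_rgb, gt_rgb, transient_mask):
    """RGB loss that excludes pixels labeled transient (cars, pedestrians).

    pred_rgb, gt_rgb: (N, 3) per-ray colors; transient_mask: (N,) bool,
    True where the segmentation flagged a transient object.
    """
    keep = ~transient_mask
    return ((pred_rgb[keep] - gt_rgb[keep]) ** 2).mean()

def sky_opacity_loss(accumulated_alpha, sky_mask):
    """Push accumulated density toward zero along sky-masked rays, so the
    sky is explained by a background model rather than floating geometry."""
    return accumulated_alpha[sky_mask].abs().mean()

# Example usage: fetch the appearance code for capture sequence 3.
seq_ids = torch.tensor([3])
app_code = appearance(seq_ids)  # shape (1, EMBED_DIM), fed to the color MLP
```

The design intuition matches the abstract: masking transients out of the photometric loss prevents moving objects from leaving ghosting artifacts, while a separate sky term keeps unbounded regions from being filled with spurious density. A ground-regularization term would follow the same masked-loss pattern.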

Country of Origin
🇯🇵 Japan

Page Count
7 pages

Category
Computer Science:
Computer Vision and Pattern Recognition