Nexels: Neurally-Textured Surfels for Real-Time Novel View Synthesis with Sparse Geometries
By: Victor Rong, Jan Held, Victor Chu and more
Though Gaussian splatting has achieved impressive results in novel view synthesis, it requires millions of primitives to model highly textured scenes, even when the scene geometry is simple. We propose a representation that goes beyond point-based rendering and decouples geometry from appearance to achieve a compact representation. We use surfels for geometry, and a combination of a global neural field and per-primitive colours for appearance. The neural field textures a fixed number of primitives for each pixel, keeping the added compute low. Our representation matches the perceptual quality of 3D Gaussian splatting while using $9.7\times$ fewer primitives and $5.5\times$ less memory on outdoor scenes, and $31\times$ fewer primitives and $3.7\times$ less memory on indoor scenes. It also renders twice as fast as existing textured primitives while improving upon their visual quality.
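The abstract's core idea — per-primitive base colours refined by a global neural field, with only a fixed number of primitives textured per pixel — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the two-layer MLP with random weights, the hit-tuple layout, and the `K=3` budget are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the global neural field: a tiny 2-layer MLP
# with random weights (the paper's actual architecture is not specified here).
W1, b1 = rng.normal(size=(3, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 3)), np.zeros(3)

def neural_field(x):
    """Map a 3D query point to an RGB offset in (-1, 1)."""
    h = np.tanh(x @ W1 + b1)
    return np.tanh(h @ W2 + b2)

def shade_pixel(hits, K=3):
    """Front-to-back alpha compositing over the K front-most surfel hits.

    hits: list of (depth, point, base_rgb, alpha), one per intersected surfel.
    Only the K nearest primitives query the neural field, so the added
    compute per pixel is bounded regardless of scene size.
    """
    hits = sorted(hits, key=lambda h: h[0])[:K]
    color = np.zeros(3)
    transmittance = 1.0
    for depth, point, base_rgb, alpha in hits:
        # Per-primitive colour plus the neural field's texture detail.
        rgb = np.clip(base_rgb + neural_field(point), 0.0, 1.0)
        color += transmittance * alpha * rgb
        transmittance *= 1.0 - alpha
    return color

# One pixel intersecting three surfels (unsorted; shade_pixel sorts by depth).
hits = [
    (2.0, np.array([0.1, 0.2, 0.3]), np.array([0.8, 0.1, 0.1]), 0.6),
    (1.0, np.array([0.4, 0.5, 0.6]), np.array([0.1, 0.8, 0.1]), 0.5),
    (3.0, np.array([0.7, 0.8, 0.9]), np.array([0.1, 0.1, 0.8]), 0.7),
]
print(shade_pixel(hits))
```

Because the composited weights `transmittance * alpha` sum to at most one and each shaded colour is clipped to [0, 1], the output stays a valid RGB value.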