MeshSplatting: Differentiable Rendering with Opaque Meshes
By: Jan Held, Sanghyun Son, Renaud Vandeghen, and more
Potential Business Impact:
Creates smooth 3D models for games and VR.
Primitive-based splatting methods like 3D Gaussian Splatting have revolutionized novel view synthesis with real-time rendering. However, their point-based representations remain incompatible with the mesh-based pipelines that power AR/VR and game engines. We present MeshSplatting, a mesh-based reconstruction approach that jointly optimizes geometry and appearance through differentiable rendering. By enforcing connectivity via restricted Delaunay triangulation and refining surface consistency, MeshSplatting produces smooth, visually high-quality meshes that render efficiently in real-time 3D engines. On Mip-NeRF360, it improves PSNR by 0.69 dB over MiLo, the current state of the art for mesh-based novel view synthesis, while training 2x faster and using half the memory, bridging neural rendering and interactive 3D graphics for seamless real-time scene interaction. The project page is available at https://meshsplatting.github.io/.
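The abstract describes the core optimization at a high level: mesh geometry and per-vertex appearance are fit jointly against training images through a differentiable renderer. The sketch below illustrates only that general idea and is not the authors' code: it assumes nvdiffrast as a stand-in differentiable rasterizer, uses a toy single-triangle scene with a synthetic target image, and omits the restricted Delaunay triangulation and surface-consistency refinement steps mentioned in the abstract.

```python
# Minimal sketch (assumptions: nvdiffrast as renderer, toy one-triangle scene).
# Not the MeshSplatting implementation; it only shows geometry and appearance
# being optimized jointly through a single photometric loss.
import torch
import nvdiffrast.torch as dr

device = "cuda"
glctx = dr.RasterizeCudaContext()
H, W = 256, 256

# Reference triangle in clip space, with a distinct color at each vertex.
tri = torch.tensor([[0, 1, 2]], dtype=torch.int32, device=device)
xyz_ref = torch.tensor([[[-0.8, -0.8, 0.0],
                         [ 0.8, -0.8, 0.0],
                         [ 0.0,  0.8, 0.0]]], device=device)
col_ref = torch.tensor([[[1.0, 0.0, 0.0],
                         [0.0, 1.0, 0.0],
                         [0.0, 0.0, 1.0]]], device=device)
ones = torch.ones(1, 3, 1, device=device)

def render(xyz, col):
    # Differentiable rasterization, per-vertex color interpolation, and edge
    # antialiasing (the antialiasing pass gives gradients w.r.t. vertex positions).
    verts = torch.cat([xyz, ones], dim=-1)  # homogeneous clip-space positions
    rast, _ = dr.rasterize(glctx, verts, tri, resolution=[H, W])
    rgb, _ = dr.interpolate(col, rast, tri)
    return dr.antialias(rgb, rast, verts, tri)

with torch.no_grad():
    target = render(xyz_ref, col_ref)  # stands in for a training photograph

# Perturbed initialization: geometry (positions) and appearance (colors)
# are both free parameters of the same photometric objective.
xyz = (xyz_ref + 0.3 * torch.randn_like(xyz_ref)).requires_grad_(True)
col = torch.rand_like(col_ref).requires_grad_(True)
opt = torch.optim.Adam([xyz, col], lr=1e-2)

for step in range(500):
    opt.zero_grad()
    loss = torch.nn.functional.mse_loss(render(xyz, col), target)
    loss.backward()
    opt.step()
    if step % 100 == 0:
        print(f"step {step:3d}  photometric loss {loss.item():.6f}")
```

The point the sketch preserves is that a single image-space loss drives gradients into both vertex positions and vertex colors; how MeshSplatting builds and maintains connected, watertight surfaces during this optimization is detailed on the project page.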
Similar Papers
TagSplat: Topology-Aware Gaussian Splatting for Dynamic Mesh Modeling and Tracking
Graphics
Creates smooth, connected 3D shapes that move realistically.
MeshSplat: Generalizable Sparse-View Surface Reconstruction via Gaussian Splatting
Graphics
Creates 3D shapes from just a few pictures.
Neural Texture Splatting: Expressive 3D Gaussian Splatting for View Synthesis, Geometry, and Dynamic Reconstruction
CV and Pattern Recognition
Makes 3D pictures look more realistic and lets them move.