A Finite Difference Approximation of Second Order Regularization of Neural-SDFs
By: Haotian Yin, Aleksander Plocharski, Michal Jan Wlodarczyk, and more
Potential Business Impact:
Makes 3D shape learning faster and less power-hungry.
We introduce a finite-difference framework for curvature regularization in neural signed distance field (SDF) learning. Existing approaches enforce curvature priors using full Hessian information obtained via second-order automatic differentiation, which is accurate but computationally expensive. Other methods reduce this overhead by avoiding explicit Hessian assembly, but still require higher-order differentiation. In contrast, our method replaces these operations with lightweight finite-difference stencils that approximate second derivatives via Taylor expansion with a truncation error of O(h^2), and that serve as drop-in replacements for Gaussian curvature and rank-deficiency losses. Experiments demonstrate that our finite-difference variants achieve reconstruction fidelity comparable to their automatic-differentiation counterparts, while reducing GPU memory usage and training time by up to a factor of two. Additional tests on sparse, incomplete, and non-CAD data confirm that the proposed formulation is robust and general, offering an efficient and scalable alternative for curvature-aware SDF learning.
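To make the stencil idea concrete, the sketch below (not the authors' code) shows how central finite differences approximate the Hessian of a scalar field with O(h^2) truncation error, the same building block the abstract describes. A closed-form sphere SDF stands in for the neural SDF so the snippet runs on its own; the function names, step size h, and the eigenvalue check at the end are illustrative assumptions, not the paper's actual loss implementation.

```python
# Sketch: central finite-difference Hessian of a scalar field f: R^3 -> R.
# Diagonal entries use the 3-point stencil (f(x+h e_i) - 2 f(x) + f(x-h e_i)) / h^2;
# off-diagonal entries use the 4-point cross stencil. Both are O(h^2) accurate.
import numpy as np

def sphere_sdf(p, radius=1.0):
    """Signed distance to a sphere of the given radius centered at the origin.
    Stand-in for a neural SDF; any callable p -> scalar works."""
    return np.linalg.norm(p, axis=-1) - radius

def fd_hessian(f, x, h=1e-3):
    """Approximate the 3x3 Hessian of f at a single point x via central differences."""
    x = np.asarray(x, dtype=np.float64)
    dim = x.shape[-1]
    eye = np.eye(dim)
    H = np.empty((dim, dim))
    f0 = f(x)
    for i in range(dim):
        ei = h * eye[i]
        # Pure second derivative along axis i.
        H[i, i] = (f(x + ei) - 2.0 * f0 + f(x - ei)) / h**2
        for j in range(i + 1, dim):
            ej = h * eye[j]
            # Mixed second derivative along axes i and j.
            mixed = (f(x + ei + ej) - f(x + ei - ej)
                     - f(x - ei + ej) + f(x - ei - ej)) / (4.0 * h**2)
            H[i, j] = H[j, i] = mixed
    return H

if __name__ == "__main__":
    p = np.array([0.6, 0.3, 0.2])
    H = fd_hessian(sphere_sdf, p)
    # An exact SDF has a zero Hessian eigenvalue along the gradient direction,
    # which is what rank-deficiency-style losses encourage; check it numerically.
    print("finite-difference Hessian:\n", H)
    print("eigenvalues (smallest should be ~0):", np.linalg.eigvalsh(H))
```

In a training loop, a loss would be built from such stencil evaluations of the network at perturbed sample points (e.g., penalizing the smallest Hessian eigenvalue or an approximate Gaussian curvature), avoiding second-order automatic differentiation entirely.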
Similar Papers
Scheduling the Off-Diagonal Weingarten Loss of Neural SDFs for CAD Models
Graphics
Makes 3D models from scans more accurate.
$\nabla$-SDF: Learning Euclidean Signed Distance Functions Online with Gradient-Augmented Octree Interpolation and Neural Residual
Robotics
Helps robots build accurate 3D maps faster.
Learning Compact Latent Space for Representing Neural Signed Distance Functions with High-fidelity Geometry Details
CV and Pattern Recognition
Lets computers create detailed 3D shapes from many examples.