A Finite Difference Approximation of Second Order Regularization of Neural-SDFs

Published: November 12, 2025 | arXiv ID: 2511.08980v1

By: Haotian Yin, Aleksander Plocharski, Michal Jan Wlodarczyk, and more

Potential Business Impact:

Makes 3D shape learning faster and less power-hungry, cutting GPU memory use and training time by up to half.

Business Areas:
A/B Testing, Data and Analytics

We introduce a finite-difference framework for curvature regularization in neural signed distance field (SDF) learning. Existing approaches enforce curvature priors using full Hessian information obtained via second-order automatic differentiation, which is accurate but computationally expensive. Other methods reduce this overhead by avoiding explicit Hessian assembly, but still require higher-order differentiation. In contrast, our method replaces these operations with lightweight finite-difference stencils that approximate second derivatives via the well-known Taylor expansion, with truncation error O(h^2), and can serve as drop-in replacements for Gaussian-curvature and rank-deficiency losses. Experiments demonstrate that our finite-difference variants achieve reconstruction fidelity comparable to their automatic-differentiation counterparts while reducing GPU memory usage and training time by up to a factor of two. Additional tests on sparse, incomplete, and non-CAD data confirm that the proposed formulation is robust and general, offering an efficient and scalable alternative for curvature-aware SDF learning.
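The core idea is easy to illustrate: for a scalar field f, the central stencil f''(x) ≈ (f(x+h) - 2 f(x) + f(x-h)) / h^2 has O(h^2) truncation error by Taylor expansion, and mixed second derivatives admit an analogous four-point stencil. Below is a minimal PyTorch sketch of a finite-difference Hessian for a neural SDF, in the spirit of the paper but not the authors' code; the toy network, the step size h, and the determinant-based penalty at the end are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation): central-difference Hessian
# of a neural SDF, usable as a drop-in source of second derivatives for
# curvature-style losses. Step size h trades truncation error against
# floating-point noise; 1e-2 is a reasonable float32 default.
import torch

def fd_hessian(sdf, x, h=1e-2):
    """Approximate the d x d Hessian of sdf at points x (N, d) with central
    differences; truncation error is O(h^2) by Taylor expansion."""
    N, d = x.shape
    H = x.new_zeros(N, d, d)
    eye = torch.eye(d, device=x.device, dtype=x.dtype)
    f0 = sdf(x)  # (N,)
    for i in range(d):
        ei = h * eye[i]
        # Diagonal entry: (f(x + h e_i) - 2 f(x) + f(x - h e_i)) / h^2
        H[:, i, i] = (sdf(x + ei) - 2.0 * f0 + sdf(x - ei)) / h**2
        for j in range(i + 1, d):
            ej = h * eye[j]
            # Off-diagonal entry: four-point central stencil, also O(h^2)
            Hij = (sdf(x + ei + ej) - sdf(x + ei - ej)
                   - sdf(x - ei + ej) + sdf(x - ei - ej)) / (4.0 * h**2)
            H[:, i, j] = Hij
            H[:, j, i] = Hij
    return H

if __name__ == "__main__":
    torch.manual_seed(0)
    # Toy SDF network (illustrative, not the paper's architecture).
    net = torch.nn.Sequential(
        torch.nn.Linear(3, 64), torch.nn.Softplus(beta=100), torch.nn.Linear(64, 1)
    )
    sdf = lambda p: net(p).squeeze(-1)
    pts = torch.rand(128, 3) * 2.0 - 1.0  # sample points in [-1, 1]^3
    H = fd_hessian(sdf, pts)              # (128, 3, 3)
    # Hypothetical rank-deficiency-style penalty: push det(H) toward zero.
    loss = torch.linalg.det(H).abs().mean()
    loss.backward()
    print(H.shape, float(loss))
```

For d = 3 this costs 19 forward evaluations per batch (one center, six axis-shifted, twelve for the mixed terms), and backpropagating the loss needs only first-order autodiff through those evaluations, which is consistent with the abstract's claim of avoiding higher-order differentiation.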

Country of Origin
🇺🇸 🇵🇱 United States, Poland

Page Count
6 pages

Category
Computer Science:
Graphics