Scheduling the Off-Diagonal Weingarten Loss of Neural SDFs for CAD Models
By: Haotian Yin, Przemyslaw Musialski
Potential Business Impact:
Makes 3D models from scans more accurate.
Neural signed distance functions (SDFs) have become a powerful representation for geometric reconstruction from point clouds, yet they often require both gradient- and curvature-based regularization to suppress spurious warping and preserve structural fidelity. FlatCAD introduced the Off-Diagonal Weingarten (ODW) loss as an efficient second-order prior for CAD surfaces, approximating full-Hessian regularization at roughly half the computational cost. However, FlatCAD applies a fixed ODW weight throughout training, which is suboptimal: strong regularization stabilizes early optimization but suppresses detail recovery in later stages. We present scheduling strategies for the ODW loss that assign a high initial weight to stabilize optimization and progressively decay it to permit fine-scale refinement. We investigate constant, linear, quintic, and step interpolation schedules, as well as an increasing warm-up variant. Experiments on the ABC CAD dataset demonstrate that time-varying schedules consistently outperform fixed weights. Our method achieves up to a 35% improvement in Chamfer Distance over the FlatCAD baseline, establishing scheduling as a simple yet effective extension of curvature regularization for robust CAD reconstruction.
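The schedules named in the abstract can be sketched as simple functions of normalized training progress. The snippet below is a minimal illustration, not the paper's implementation: the exact functional forms, the mid-training step point, and the endpoint weights `w_start`/`w_end` are assumptions for the sake of example.

```python
def odw_weight(step, total_steps, w_start=1.0, w_end=0.0, mode="quintic"):
    """Time-varying weight for the ODW regularization term.

    Interpolates from w_start down to w_end as training progresses
    (or upward for the warm-up variant). Forms are illustrative.
    """
    t = min(max(step / total_steps, 0.0), 1.0)  # normalized progress in [0, 1]
    if mode == "constant":
        return w_start  # fixed weight, as in the FlatCAD baseline
    if mode == "linear":
        return w_end + (w_start - w_end) * (1.0 - t)
    if mode == "quintic":
        return w_end + (w_start - w_end) * (1.0 - t) ** 5  # fast early decay
    if mode == "step":
        return w_start if t < 0.5 else w_end  # single drop at mid-training (assumed)
    if mode == "warmup":
        return w_end + (w_start - w_end) * t  # increasing variant
    raise ValueError(f"unknown schedule: {mode}")
```

In a training loop, the scheduled weight would simply scale the curvature term, e.g. `loss = sdf_loss + odw_weight(step, total_steps) * odw_loss`, so early steps are strongly regularized while later steps are free to recover fine detail.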
Similar Papers
A Finite Difference Approximation of Second Order Regularization of Neural-SDFs
Graphics
Makes 3D shape learning faster and use less power.
CAO: Curvature-Adaptive Optimization via Periodic Low-Rank Hessian Sketching
Machine Learning (CS)
Makes computer learning models train much faster.
Learning Compact Latent Space for Representing Neural Signed Distance Functions with High-fidelity Geometry Details
CV and Pattern Recognition
Lets computers create detailed 3D shapes from many examples.