A Variational Framework for Residual-Based Adaptivity in Neural PDE Solvers and Operator Learning

Published: September 17, 2025 | arXiv ID: 2509.14198v1

By: Juan Diego Toscano, Daniel T. Chen, Vivek Oommen, and more

Potential Business Impact:

Makes machine-learning models that solve physics equations (PDEs) train faster and more accurately.

Business Areas:
A/B Testing, Data and Analytics

Abstract

Residual-based adaptive strategies are widely used in scientific machine learning but remain largely heuristic. We introduce a unifying variational framework that formalizes these methods by integrating convex transformations of the residual. Different transformations correspond to distinct objective functionals: exponential weights target the minimization of uniform error, while linear weights recover the minimization of quadratic error. Within this perspective, adaptive weighting is equivalent to selecting sampling distributions that optimize the primal objective, thereby linking discretization choices directly to error metrics. This principled approach yields three benefits: (1) it enables the systematic design of adaptive schemes across norms, (2) it reduces discretization error through variance reduction of the loss estimator, and (3) it enhances learning dynamics by improving the gradient signal-to-noise ratio. Extending the framework to operator learning, we demonstrate substantial performance gains across optimizers and architectures. Our results provide a theoretical justification for residual-based adaptivity and establish a foundation for principled discretization and training strategies.
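To make the abstract's weighting idea concrete: the phrase "integrating convex transformations of the residual" suggests, in hedged notation (the paper's own symbols may differ), an objective of the form

\[
J_\varphi[u_\theta] = \int_\Omega \varphi\big(r(x; u_\theta)^2\big)\, dx,
\qquad
\varphi(s) = s \;\Rightarrow\; \text{quadratic } (L^2) \text{ error},
\qquad
\varphi(s) = e^{\lambda s} \;\Rightarrow\; \text{uniform } (L^\infty) \text{ error},
\]

where the exponential case targets the uniform norm through the standard log-sum-exp limit \(\tfrac{1}{\lambda}\log\int_\Omega e^{\lambda r^2}\,dx \to \operatorname{ess\,sup} r^2\) as \(\lambda \to \infty\). The sketch below illustrates the resulting adaptive-weighting loop on a toy one-parameter residual in Python/NumPy; the residual function, the temperature lam, and treating the weights as constants during the gradient step are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def residual(x, theta):
    # Toy "PDE residual" r(x; theta), for illustration only; in a real
    # neural PDE solver this would be the differential operator applied
    # to the network surrogate at the collocation points.
    return np.sin(np.pi * x) - theta * x * (1.0 - x)

def adaptive_weights(r, scheme="exponential", lam=5.0):
    # Convex transformations of the squared residual: "linear" leaves the
    # usual mean-squared (L2) objective intact, while "exponential"
    # concentrates weight on the largest residuals, pushing the fit
    # toward a uniform-error (L-infinity-like) target.
    r2 = r ** 2
    if scheme == "linear":
        w = np.ones_like(r2)
    else:  # exponential; subtract the max for numerical stability
        w = np.exp(lam * (r2 - r2.max()))
    return w / w.sum()  # normalized: the weights act as a sampling distribution

theta, lr = 0.0, 0.5
x = rng.uniform(0.0, 1.0, size=256)  # fixed collocation points

for _ in range(500):
    r = residual(x, theta)
    w = adaptive_weights(r)  # weights treated as constants in the gradient
    grad = np.sum(w * 2.0 * r * (-x * (1.0 - x)))  # d/d(theta) of sum_i w_i r_i^2
    theta -= lr * grad

r = residual(x, theta)
print(f"theta = {theta:.4f}, max |residual| = {np.abs(r).max():.4f}")
```

Note the normalization w / w.sum(): it is what lets the weights be read as a sampling distribution over collocation points, which is the link the abstract draws between adaptive weighting and discretization choices.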

Country of Origin
🇺🇸 United States

Page Count
50 pages

Category
Computer Science: Machine Learning (CS)