Neural Preconditioning via Krylov Subspace Geometry
By: Nunzio Dimola, Alessandro Coclite, Paolo Zunino
Potential Business Impact:
Teaches computers to solve tricky math problems faster.
We propose a geometry-aware strategy for training neural preconditioners tailored to parametrized linear systems arising from the discretization of mixed-dimensional partial differential equations (PDEs). These systems are typically ill-conditioned because of the embedded lower-dimensional structures and are solved with Krylov subspace methods. Our approach approximates the inverse operator through a two-stage training framework: an initial static pre-training phase, based on residual minimization, followed by a dynamic fine-tuning phase that incorporates solver convergence dynamics into training via a novel loss functional. This dynamic loss is defined by the principal angles between the residuals and the Krylov subspaces, and it is evaluated with a differentiable implementation of the Flexible GMRES algorithm, which enables backpropagation through both the Arnoldi process and the Givens rotations. The resulting neural preconditioner is explicitly optimized to improve early-stage convergence and reduce iteration counts in a family of 3D-1D mixed-dimensional problems with geometric variability of the 1D domain. Numerical experiments show that our solver-aligned approach significantly improves the convergence rate, robustness, and generalization.
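To make the solver-aware training idea concrete, below is a minimal sketch, assuming PyTorch, of a differentiable Flexible GMRES loop driven by a toy neural preconditioner. It is not the authors' implementation: the names `NeuralPreconditioner` and `flexible_gmres_loss`, the test matrix, and the use of per-iteration relative residual norms as a stand-in for the principal-angle loss functional are all illustrative assumptions.

```python
# Minimal sketch of training a neural preconditioner through a
# differentiable Flexible GMRES loop (assumption: PyTorch; not the paper's code).
import torch

torch.manual_seed(0)


class NeuralPreconditioner(torch.nn.Module):
    """Toy dense network standing in for the learned approximate inverse."""

    def __init__(self, n: int):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(n, 2 * n), torch.nn.Tanh(), torch.nn.Linear(2 * n, n)
        )

    def forward(self, r: torch.Tensor) -> torch.Tensor:
        return self.net(r)


def flexible_gmres_loss(A, b, precond, m=5):
    """Run m Flexible GMRES steps and return a differentiable loss built from
    the per-iteration relative residual norms (a proxy for the principal-angle
    functional described in the abstract)."""
    beta = torch.linalg.norm(b)
    V = [b / beta]                 # Arnoldi basis of the Krylov subspace
    cs, sn = [], []                # Givens rotation parameters
    g = [beta]                     # rotated right-hand side, grown each step
    rel_residuals = []
    for j in range(m):
        z = precond(V[j])          # flexible step: preconditioner may vary
        w = A @ z
        # Arnoldi process: modified Gram-Schmidt against the current basis
        h = []
        for vi in V:
            hij = torch.dot(w, vi)
            h.append(hij)
            w = w - hij * vi
        h_next = torch.linalg.norm(w)
        V.append(w / (h_next + 1e-30))
        # Apply previously computed Givens rotations to the new column of H
        for i in range(j):
            h[i], h[i + 1] = (cs[i] * h[i] + sn[i] * h[i + 1],
                              -sn[i] * h[i] + cs[i] * h[i + 1])
        # New rotation annihilating the subdiagonal entry
        denom = torch.sqrt(h[j] ** 2 + h_next ** 2) + 1e-30
        c, s = h[j] / denom, h_next / denom
        cs.append(c)
        sn.append(s)
        # Rotate the right-hand side; |g[j+1]| is the current residual norm
        g_j = g[j]
        g[j] = c * g_j
        g.append(-s * g_j)
        rel_residuals.append(torch.abs(g[j + 1]) / beta)
    # Solver-aligned loss: reward fast decay of the early relative residuals
    return torch.stack(rel_residuals).mean()


n = 32
A = torch.randn(n, n) / n ** 0.5 + 2.0 * torch.eye(n)   # well-posed toy matrix
b = torch.randn(n)
precond = NeuralPreconditioner(n)
opt = torch.optim.Adam(precond.parameters(), lr=1e-3)
for step in range(100):
    opt.zero_grad()
    loss = flexible_gmres_loss(A, b, precond)
    loss.backward()          # backprop through Arnoldi and Givens rotations
    opt.step()
```

In this sketch the Hessenberg entries, rotation parameters, and rotated right-hand side are kept in Python lists rather than written into pre-allocated tensors; avoiding in-place tensor updates lets autograd differentiate cleanly through both the Arnoldi recurrence and the Givens rotations.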
Similar Papers
Hybrid Iterative Solvers with Geometry-Aware Neural Preconditioners for Parametric PDEs
Machine Learning (CS)
Teaches computers to solve math problems on any shape.
Neural Approximate Inverse Preconditioners
Numerical Analysis
Teaches computers to solve hard math problems faster.