Neural Preconditioning via Krylov Subspace Geometry

Published: July 21, 2025 | arXiv ID: 2507.15452v1

By: Nunzio Dimola, Alessandro Coclite, Paolo Zunino

Potential Business Impact:

Teaches neural networks to help computers solve the large equation systems behind physics simulations faster.

Business Areas:
Indoor Positioning Navigation and Mapping

We propose a geometry-aware strategy for training neural preconditioners tailored to parametrized linear systems arising from the discretization of mixed-dimensional partial differential equations (PDEs). These systems are typically ill-conditioned because of the embedded lower-dimensional structures and are solved using Krylov subspace methods. Our approach learns an approximation of the inverse operator through a two-stage training framework: an initial static pre-training phase based on residual minimization, followed by a dynamic fine-tuning phase that incorporates solver convergence dynamics into training via a novel loss functional. This dynamic loss is defined by the principal angles between the residuals and the Krylov subspaces, and it is evaluated with a differentiable implementation of the Flexible GMRES algorithm that enables backpropagation through both the Arnoldi process and the Givens rotations. The resulting neural preconditioner is explicitly optimized to improve early-stage convergence and reduce iteration counts across a family of 3D-1D mixed-dimensional problems with geometric variability in the 1D domain. Numerical experiments show that our solver-aligned approach significantly improves convergence rate, robustness, and generalization.
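
To make the two training stages concrete, here is a minimal sketch of the static pre-training objective in JAX. It assumes the preconditioner is a network P_theta trained so that A P_theta(b) ≈ b for sampled right-hand sides; the names precond_net and apply_A are illustrative placeholders, not the authors' code.

```python
# Static pre-training (sketch): drive P_theta toward the inverse operator by
# minimizing the residual of its output. Assumed interface: precond_net(params, v)
# applies the network, apply_A(v) applies the discretized operator matrix-free.
import jax
import jax.numpy as jnp

def residual_pretrain_loss(params, precond_net, apply_A, b):
    """Residual-minimization objective ||A P_theta(b) - b||^2."""
    x_hat = precond_net(params, b)   # P_theta(b): candidate approximate solve
    r = apply_A(x_hat) - b           # residual left by the learned inverse
    return jnp.sum(r ** 2)

# Gradients with respect to the network parameters for a standard optimizer step.
pretrain_grad = jax.grad(residual_pretrain_loss)
```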
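
The angle-based loss has a standard geometric reading for minimal-residual methods: with a zero initial guess, the FGMRES residual after k steps is the component of r_0 orthogonal to the space A Z_k spanned by the mapped search directions, so

\[
\frac{\|r_k\|_2}{\|r_0\|_2} \;=\; \sin \angle\left(r_0,\; A Z_k\right),
\]

and small principal angles between residuals and the preconditioned Krylov spaces translate directly into fast early-stage convergence.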
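
Fine-tuning requires gradients through the solver itself. The sketch below, under the same assumptions as above, unrolls a fixed number of FGMRES steps with modified Gram-Schmidt Arnoldi and explicit Givens rotations, all in autodiff-friendly JAX operations, and places a surrogate loss on the normalized early residual norms (the sines above); the paper's exact principal-angle functional may differ.

```python
import jax.numpy as jnp

def fgmres_residual_history(apply_A, precond, b, m):
    """Unrolled, differentiable Flexible GMRES (sketch: no restarts, no
    breakdown handling). Returns ||r_0||, ..., ||r_m|| so a training loss
    can target early-iteration convergence."""
    beta = jnp.linalg.norm(b)                 # x0 = 0, hence r0 = b
    V = [b / beta]                            # orthonormal Krylov basis
    H = jnp.zeros((m + 1, m))
    cs, sn = [], []                           # stored Givens rotations
    g = jnp.zeros(m + 1).at[0].set(beta)      # rhs of the small least-squares problem
    hist = [beta]
    for j in range(m):
        w = apply_A(precond(V[j]))            # flexible step: preconditioned direction
        for i in range(j + 1):                # modified Gram-Schmidt Arnoldi
            hij = jnp.dot(V[i], w)
            H = H.at[i, j].set(hij)
            w = w - hij * V[i]
        hh = jnp.linalg.norm(w)
        H = H.at[j + 1, j].set(hh)
        V.append(w / (hh + 1e-30))
        col = H[:, j]
        for i in range(j):                    # apply earlier rotations to the new column
            a, bb = col[i], col[i + 1]
            col = col.at[i].set(cs[i] * a + sn[i] * bb)
            col = col.at[i + 1].set(-sn[i] * a + cs[i] * bb)
        rho = jnp.sqrt(col[j] ** 2 + col[j + 1] ** 2) + 1e-30
        c, s = col[j] / rho, col[j + 1] / rho # new rotation zeroing col[j + 1]
        cs.append(c)
        sn.append(s)
        gj, gj1 = g[j], g[j + 1]
        g = g.at[j].set(c * gj + s * gj1).at[j + 1].set(-s * gj + c * gj1)
        hist.append(jnp.abs(g[j + 1]))        # |g[j+1]| equals ||r_{j+1}||
    return jnp.stack(hist)

def dynamic_finetune_loss(params, precond_net, apply_A, b, m=8):
    """Surrogate dynamic loss: sum of ||r_k||/||r_0|| = sin(theta_k) over the
    first m iterations, rewarding fast early residual decay."""
    hist = fgmres_residual_history(apply_A, lambda v: precond_net(params, v), b, m)
    return jnp.sum(hist[1:] / hist[0])
```

Taking jax.grad of dynamic_finetune_loss with respect to params then flows back through both the Arnoldi loop and the rotation cascade, which is the differentiability the abstract refers to.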

Country of Origin
🇮🇹 Italy

Page Count
24 pages

Category
Mathematics: Numerical Analysis