Neural Approximate Inverse Preconditioners
By: Tianshi Xu, Rui Peng Li, Yuanzhe Xi
Potential Business Impact:
Speeds up the computer solution of the large equation systems behind physics and engineering simulations.
In this paper, we propose a data-driven framework for constructing efficient approximate inverse preconditioners for elliptic partial differential equations (PDEs) by learning the Green's function of the underlying operator with neural networks (NNs). The training process integrates four key components: an adaptive multiscale neural architecture ($\alpha$MSNN) that captures hierarchical features across near-, middle-, and far-field regimes; the use of coarse-grid anchor data to ensure physical identifiability; a multi-$\varepsilon$ staged training protocol that progressively refines the Green's function representation across spatial scales; and an overlapping domain decomposition that enables local adaptation while maintaining global consistency. Once trained, the NN-approximated Green's function is directly compressed into either a hierarchical ($\mathcal{H}$-) matrix or a sparse matrix, using only the mesh geometry and the network output. This geometric construction achieves nearly linear complexity in both setup and application while preserving the spectral properties essential for effective preconditioning. Numerical experiments on challenging elliptic PDEs demonstrate that the resulting preconditioners consistently yield fast convergence and small iteration counts.
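Once the network is trained, the compression step is purely geometric. Below is a minimal sketch of that step, not the authors' implementation: a closed-form stand-in for a trained network g_theta(x, y) (the exact Green's function of $-u''$ on $(0,1)$ with zero Dirichlet boundary conditions, so the example is self-contained) is sampled on mesh-point pairs, small entries are dropped, and the resulting sparse matrix is used as an approximate inverse preconditioner inside conjugate gradients. The grid size, the drop tolerance, and the name g_theta are illustrative assumptions, not values from the paper.

    # Minimal sketch: build a sparse approximate inverse preconditioner from a
    # (stand-in for a) learned Green's function. g_theta, n, and the drop
    # tolerance are illustrative assumptions, not the paper's choices.
    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    n = 64                              # interior grid points on (0, 1)
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)      # mesh geometry

    def g_theta(xi, yj):
        # Stand-in for the trained network: the exact Green's function of
        # -u'' on (0, 1) with zero Dirichlet BCs,
        # G(x, y) = min(x, y) * (1 - max(x, y)).
        return np.minimum(xi, yj) * (1.0 - np.maximum(xi, yj))

    # Discrete operator A: standard 3-point finite-difference Laplacian.
    A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr") / h**2

    # Geometric construction: sample G on all mesh-point pairs, scale by the
    # quadrature weight h so M approximates the inverse integral operator,
    # then drop entries below a relative tolerance to obtain a sparse matrix.
    G = g_theta(x[:, None], x[None, :]) * h
    tol = 1e-2 * np.abs(G).max()        # illustrative relative drop tolerance
    G[np.abs(G) < tol] = 0.0
    M = sp.csr_matrix(G)                # sparse approximate inverse of A

    # Apply M as a preconditioner in CG; with the exact Green's function and
    # no dropping, M equals A^{-1} and CG converges in one iteration.
    b = np.ones(n)
    u, info = spla.cg(A, b, M=M)
    print("CG converged:", info == 0, " residual:", np.linalg.norm(b - A @ u))

In two and three dimensions the Green's function decays away from the diagonal, which is what makes the sparse and $\mathcal{H}$-matrix compressions effective; the same sampling step then needs to store only near-field entries (or low-rank far-field blocks) to retain a nearly linear setup cost.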
Similar Papers
Numerical Solution of Mixed-Dimensional PDEs Using a Neural Preconditioner
Numerical Analysis
Teaches computers to solve tricky math problems faster.
A matrix preconditioning framework for physics-informed neural networks based on adjoint method
Numerical Analysis
Makes computer models solve hard science problems faster.