SVD-NO: Learning PDE Solution Operators with SVD Integral Kernels
By: Noam Koren, Ralf J. J. Mackenbach, Ruud J. G. van Sloun and more
Potential Business Impact:
Uses AI to solve hard physics and engineering equations (PDEs) faster.
Neural operators have emerged as a promising paradigm for learning solution operators of partial differential equations (PDEs) directly from data. Existing methods, such as those based on Fourier or graph techniques, make strong assumptions about the structure of the kernel integral operator, assumptions which may limit expressivity. We present SVD-NO, a neural operator that explicitly parameterizes the kernel by its singular-value decomposition (SVD) and then carries out the integral directly in the low-rank basis. Two lightweight networks learn the left and right singular functions, a diagonal parameter matrix learns the singular values, and a Gram-matrix regularizer enforces orthonormality. As SVD-NO approximates the full kernel, it obtains a high degree of expressivity. Furthermore, due to its low-rank structure, the computational complexity of applying the operator remains reasonable, leading to a practical system. In extensive evaluations on five diverse benchmark equations, SVD-NO achieves a new state of the art. In particular, SVD-NO provides greater performance gains on PDEs whose solutions are highly spatially variable. The code of this work is publicly available at https://github.com/2noamk/SVDNO.git.
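The abstract describes the core mechanism: the kernel k(x, y) is represented by a truncated SVD, k(x, y) ≈ Σ_r σ_r φ_r(x) ψ_r(y), so the kernel integral reduces to a projection onto the learned right singular functions, a diagonal scaling by the singular values, and an expansion in the left singular functions. The following is a minimal sketch of such a layer, assuming PyTorch, a uniform-grid quadrature, and hypothetical names (SVDKernelLayer, gram_penalty) not taken from the released code:

```python
# Minimal sketch of an SVD-parameterized kernel integral layer.
# Not the authors' implementation; layer/method names are illustrative.
import torch
import torch.nn as nn

class SVDKernelLayer(nn.Module):
    def __init__(self, rank: int, coord_dim: int = 1, width: int = 64):
        super().__init__()
        # Two lightweight MLPs produce the left/right singular functions
        # phi_r(x) and psi_r(y), one output channel per rank component.
        def mlp():
            return nn.Sequential(
                nn.Linear(coord_dim, width), nn.GELU(),
                nn.Linear(width, width), nn.GELU(),
                nn.Linear(width, rank),
            )
        self.phi = mlp()   # left singular functions
        self.psi = mlp()   # right singular functions
        self.sigma = nn.Parameter(torch.ones(rank))  # diagonal of singular values

    def forward(self, v: torch.Tensor, coords: torch.Tensor) -> torch.Tensor:
        # v:      (batch, n_points) input function sampled on the grid
        # coords: (n_points, coord_dim) grid coordinates
        n = coords.shape[0]
        Phi = self.phi(coords)             # (n_points, rank)
        Psi = self.psi(coords)             # (n_points, rank)
        # u(x) = ∫ k(x, y) v(y) dy with k(x, y) ≈ Σ_r σ_r φ_r(x) ψ_r(y);
        # the integral is approximated by a mean over grid points (1/n).
        coeffs = (v @ Psi) / n             # (batch, rank): ⟨ψ_r, v⟩
        return (coeffs * self.sigma) @ Phi.T   # (batch, n_points)

    def gram_penalty(self, coords: torch.Tensor) -> torch.Tensor:
        # Regularizer pushing the Gram matrices of phi and psi toward the
        # identity, i.e. orthonormal singular functions under the quadrature.
        n = coords.shape[0]
        Phi, Psi = self.phi(coords), self.psi(coords)
        eye = torch.eye(Phi.shape[1], device=coords.device)
        return ((Phi.T @ Phi / n - eye) ** 2).sum() + \
               ((Psi.T @ Psi / n - eye) ** 2).sum()
```

In a training loop, the gram_penalty term would be added to the data-fitting loss with a small weight, playing the role of the Gram-matrix regularizer mentioned in the abstract.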
Similar Papers
Learning Solution Operators for Partial Differential Equations via Monte Carlo-Type Approximation
Machine Learning (CS)
Makes computer models solve problems faster and cheaper.
PODNO: Proper Orthogonal Decomposition Neural Operators
Numerical Analysis
Solves hard math problems faster and better.
Stable spectral neural operator for learning stiff PDE systems from limited data
Computational Physics
Learns hidden rules from few examples.