NOWS: Neural Operator Warm Starts for Accelerating Iterative Solvers
By: Mohammad Sadegh Eshaghi, Cosmin Anitescu, Navid Valizadeh, and more
Potential Business Impact:
Speeds up computer simulations by up to 90%.
Partial differential equations (PDEs) underpin quantitative descriptions across the physical sciences and engineering, yet high-fidelity simulation remains a major computational bottleneck for many-query, real-time, and design tasks. Data-driven surrogates can be strikingly fast but are often unreliable when applied outside their training distribution. Here we introduce Neural Operator Warm Starts (NOWS), a hybrid strategy that harnesses learned solution operators to accelerate classical iterative solvers by producing high-quality initial guesses for Krylov methods such as conjugate gradient and GMRES. NOWS leaves existing discretizations and solver infrastructures intact, integrating seamlessly with finite-difference, finite-element, finite-volume, and isogeometric-analysis codes. Across our benchmarks, the learned initialization consistently reduces iteration counts and end-to-end runtime, cutting computational time by up to 90% while preserving the stability and convergence guarantees of the underlying numerical algorithms. By combining the rapid inference of neural operators with the rigor of traditional solvers, NOWS provides a practical and trustworthy approach to accelerating high-fidelity PDE simulations.
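Conceptually, the warm-start idea only changes the initial guess handed to the Krylov solver; the discretization and the solver itself are untouched. The following minimal Python sketch illustrates that wiring with SciPy's conjugate-gradient routine on a toy 1D Poisson system. It is not the authors' implementation: the surrogate call `neural_operator(f)` is hypothetical, and here a good surrogate prediction is imitated by a slightly perturbed reference solution purely for illustration.

```python
# Minimal sketch of Krylov warm-starting, assuming a trained surrogate
# `neural_operator` (hypothetical) that maps problem data to an approximate
# solution on the same grid. Only the initial guess x0 changes.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Toy 1D Poisson problem -u'' = f as a stand-in for an assembled PDE system.
n = 1000
h = 1.0 / (n + 1)
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr") / h**2
b = np.ones(n)

def cg_iterations(x0, label):
    """Run conjugate gradient from the given initial guess and report iterations."""
    iters = 0
    def count(xk):
        nonlocal iters
        iters += 1
    x, info = spla.cg(A, b, x0=x0, callback=count)
    print(f"{label}: {iters} CG iterations (info={info})")
    return x

# Cold start: the usual zero initial guess.
cg_iterations(np.zeros(n), "cold start")

# Warm start: in NOWS this guess would come from the neural operator,
# e.g. x0 = neural_operator(f) (hypothetical). Here we imitate a good
# surrogate prediction with a slightly perturbed reference solution.
x_ref = spla.spsolve(A.tocsc(), b)
x0_warm = x_ref + 1e-3 * np.random.default_rng(0).standard_normal(n)
cg_iterations(x0_warm, "warm start from surrogate-like guess")
```

With an initial guess that close to the true solution, CG stops after far fewer iterations than the cold start, which is the effect NOWS aims to achieve using a real neural-operator prediction while keeping the solver's convergence guarantees intact.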
Similar Papers
WoSNN: Stochastic Solver for PDEs with Machine Learning
Numerical Analysis
Solves hard math problems much faster.
Accelerating PDE Solvers with Equation-Recast Neural Operator Preconditioning
Machine Learning (CS)
Speeds up computer simulations of physics problems.
Data-Efficient Time-Dependent PDE Surrogates: Graph Neural Simulators vs. Neural Operators
Machine Learning (CS)
Helps computers learn science faster with less data.