Score: 1

Numerical PDE solvers outperform neural PDE solvers

Published: July 28, 2025 | arXiv ID: 2507.21269v1

By: Patrick Chatain, Michael Rizvi-Martel, Guillaume Rabusseau, and more

Potential Business Impact:

Enables fast, interpretable computer models of physical processes (e.g., heat flow and fluid transport) that also recover the underlying physical parameters from data.

Business Areas:
DSP Hardware

We present DeepFDM, a differentiable finite-difference framework for learning spatially varying coefficients in time-dependent partial differential equations (PDEs). By embedding a classical forward-Euler discretization into a convolutional architecture, DeepFDM enforces stability and first-order convergence via CFL-compliant coefficient parameterizations. Model weights correspond directly to PDE coefficients, yielding an interpretable inverse-problem formulation. We evaluate DeepFDM on a benchmark suite of scalar PDEs: advection, diffusion, advection-diffusion, reaction-diffusion, and inhomogeneous Burgers' equations, in one, two, and three spatial dimensions. In both in-distribution and out-of-distribution tests (quantified by the Hellinger distance between coefficient priors), DeepFDM attains normalized mean-squared errors one to two orders of magnitude smaller than those of Fourier Neural Operators, U-Nets, and ResNets; requires 10-20X fewer training epochs; and uses 5-50X fewer parameters. Moreover, recovered coefficient fields accurately match ground-truth parameters. These results establish DeepFDM as a robust, efficient, and transparent baseline for data-driven solution and identification of parametric PDEs.
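The abstract describes embedding a classical forward-Euler finite-difference update, with CFL-compliant coefficients, into a convolutional architecture. The sketch below is not the authors' code; it only illustrates the kind of explicit update DeepFDM builds on, for a 1D advection-diffusion equation with hypothetical spatially varying coefficient fields a(x) and d(x), and a time step chosen to respect the stability bound.

```python
import numpy as np

def forward_euler_step(u, a, d, dx, dt):
    """One explicit finite-difference step for u_t + a(x) u_x = d(x) u_xx."""
    # Stability (CFL-type) condition for upwind advection + central diffusion:
    #   |a| dt / dx + 2 d dt / dx^2 <= 1  at every grid point.
    assert np.max(np.abs(a) * dt / dx + 2.0 * d * dt / dx**2) <= 1.0, "CFL violated"

    u_left = np.roll(u, 1)    # periodic boundary conditions
    u_right = np.roll(u, -1)

    # Upwind difference for advection, central difference for diffusion.
    u_x = np.where(a > 0, (u - u_left) / dx, (u_right - u) / dx)
    u_xx = (u_right - 2.0 * u + u_left) / dx**2

    return u + dt * (-a * u_x + d * u_xx)

# Usage: advect and diffuse a Gaussian bump on a periodic 1D grid.
n = 256
dx = 1.0 / n
x = np.arange(n) * dx
u = np.exp(-((x - 0.5) ** 2) / 0.005)
a = 0.5 + 0.2 * np.sin(2 * np.pi * x)            # hypothetical advection speed field
d = 1e-4 * (1.0 + 0.5 * np.cos(2 * np.pi * x))   # hypothetical diffusivity field
dt = 0.5 / np.max(np.abs(a) / dx + 2.0 * d / dx**2)  # step size within the stability bound
for _ in range(200):
    u = forward_euler_step(u, a, d, dx, dt)
```

In DeepFDM, a stencil of this form is expressed as a convolution whose weights are the coefficient fields themselves, which is why the trained weights can be read off directly as estimates of a(x) and d(x).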

Repos / Data Links

Page Count
17 pages

Category
Mathematics:
Numerical Analysis (Math)