Towards a Foundation Model for Partial Differential Equations Across Physics Domains
By: Eduardo Soares, Emilio Vital Brazil, Victor Shirasuna, and others
Potential Business Impact:
Predicts how physical systems move and change over time, reducing the need for costly simulations.
We present PDE-FM, a modular foundation model for physics-informed machine learning that unifies spatial, spectral, and temporal reasoning across heterogeneous partial differential equation (PDE) systems. PDE-FM combines spatial-spectral tokenization, physics-aware conditioning, and a Mamba-based state-space backbone with an operator-theoretic decoder, enabling scalable and data-efficient modeling of complex physical dynamics. In contrast to task-specific neural operators, PDE-FM is pretrained once on diverse PDE datasets and can be transferred to new physical regimes without architectural or data-specific modifications. Evaluated on twelve 2D and 3D datasets from The Well benchmark (spanning hydrodynamic, radiative, elastic, and astrophysical phenomena), PDE-FM achieves state-of-the-art accuracy in six domains, reducing mean VRMSE by 46% relative to prior operator-learning baselines. The model demonstrates robust cross-physics generalization, excelling in turbulent and radiative systems while maintaining strong performance in linear and steady-state regimes. These results suggest that large-scale pretraining across diverse physical processes can yield transferable representations of dynamics, marking a step toward unified, foundation-level surrogates for multi-physics simulation and scientific discovery.
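The headline result is stated in terms of VRMSE (variance-scaled RMSE), the evaluation metric used by The Well benchmark. As a rough guide to what the metric measures, here is a minimal sketch of a common variance-scaled form, where the mean squared error is normalized by the variance of the target field before taking the square root; the exact definition in The Well may differ in detail (e.g. per-field or per-timestep averaging), so treat this as illustrative.

```python
import numpy as np

def vrmse(pred, target, eps=1e-8):
    """Variance-scaled RMSE: RMSE normalized by the spread of the target.

    A value of 1.0 roughly means the prediction is no better than
    predicting the target's mean everywhere; 0.0 is a perfect match.
    The eps term guards against division by zero for constant targets.
    """
    mse = np.mean((pred - target) ** 2)
    return float(np.sqrt(mse / (np.var(target) + eps)))

# Example: a small constant offset on a simple target field.
target = np.array([1.0, 2.0, 3.0])
pred = target + 0.1
score = vrmse(pred, target)
```

Under this form, the reported 46% reduction in mean VRMSE would correspond to a new score of 0.54 times the baseline's, averaged over the benchmark datasets.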
Similar Papers
Generalizing PDE Emulation with Equation-Aware Neural Operators
Machine Learning (CS)
AI learns to solve many math problems faster.
Hierarchical Physics-Embedded Learning for Spatiotemporal Dynamical Systems
Machine Learning (CS)
Finds hidden science rules from messy data.
Multi-Operator Few-Shot Learning for Generalization Across PDE Families
Machine Learning (CS)
Teaches computers to solve new math problems with few examples.