Generalizing PDE Emulation with Equation-Aware Neural Operators
By: Qian-Ze Zhu, Paul Raccuglia, Michael P. Brenner
Potential Business Impact:
AI learns to solve many math problems faster.
Solving partial differential equations (PDEs) with traditional numerical methods can be prohibitively expensive, and deep learning-based surrogate models typically specialize in a single PDE with fixed parameters. We present a framework for equation-aware emulation that generalizes to unseen PDEs by conditioning a neural model on a vector encoding of the terms in a PDE and their coefficients. We establish a baseline of four distinct modeling techniques, trained on a family of 1D PDEs from the APEBench suite. Our approach achieves strong performance on parameter sets held out from the training distribution, remains stable for rollouts beyond the training window, and generalizes to an entirely unseen PDE. This work was developed as part of a broader effort exploring AI systems that automate the creation of expert-level empirical software for scorable scientific tasks. The data and codebase are available at https://github.com/google-research/generalized-pde-emulator.
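To illustrate the conditioning idea, here is a minimal sketch of how a PDE might be encoded as a fixed-length coefficient vector and fed to a surrogate alongside the solution state. The term ordering, term names, and the toy conditioned layer are illustrative assumptions, not the paper's actual encoding or architecture.

```python
import numpy as np

# Hypothetical fixed ordering of candidate terms in a 1D PDE family
# (advection, diffusion, dispersion, nonlinear advection).
TERMS = ["u_x", "u_xx", "u_xxx", "u*u_x"]

def encode_pde(coeffs: dict) -> np.ndarray:
    """Map {term: coefficient} to a fixed-length vector; absent terms get 0."""
    return np.array([coeffs.get(t, 0.0) for t in TERMS])

# Example encodings (signs follow u_t = sum_i c_i * term_i):
# Burgers' equation  u_t = -u*u_x + nu*u_xx
burgers = encode_pde({"u*u_x": -1.0, "u_xx": 0.01})
# KdV equation       u_t = -u*u_x - u_xxx
kdv = encode_pde({"u*u_x": -1.0, "u_xxx": -1.0})

def conditioned_step(u: np.ndarray, pde_vec: np.ndarray, W: np.ndarray) -> np.ndarray:
    """One toy emulator step: concatenate the state with the PDE encoding
    and apply a learned map (here a single tanh layer stands in for the model)."""
    x = np.concatenate([u, pde_vec])
    return np.tanh(x @ W)

rng = np.random.default_rng(0)
u0 = rng.normal(size=64)                      # discretized initial condition
W = rng.normal(size=(64 + len(TERMS), 64))    # stand-in for trained weights
u1 = conditioned_step(u0, burgers, W)         # next-step prediction, shape (64,)
```

Because the equation enters only through `pde_vec`, the same weights can in principle be rolled out on any PDE expressible in the chosen term basis, which is what enables generalization to held-out coefficients.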
Similar Papers
Conditioning on PDE Parameters to Generalise Deep Learning Emulation of Stochastic and Chaotic Dynamics
Machine Learning (CS)
Makes computer simulations run much faster.
Neural Emulator Superiority: When Machine Learning for PDEs Surpasses its Training Data
Machine Learning (CS)
Computer models learn better than their training data.
Active Learning with Selective Time-Step Acquisition for PDEs
Machine Learning (CS)
Teaches computers to solve science problems faster.