P3D: Scalable Neural Surrogates for High-Resolution 3D Physics Simulations with Global Context
By: Benjamin Holzschuh, Georg Kohl, Florian Redinger, and others
Potential Business Impact:
Makes high-resolution computer simulations of physical systems run faster.
We present a scalable framework for learning deterministic and probabilistic neural surrogates for high-resolution 3D physics simulations. We introduce a hybrid CNN-Transformer backbone architecture tailored to 3D physics simulations, which significantly outperforms existing architectures in both speed and accuracy. Our proposed network can be pretrained on small patches of the simulation domain, which can then be fused to obtain a global solution, optionally guided by a fast and scalable sequence-to-sequence model to include long-range dependencies. This setup allows large-scale models to be trained with reduced memory and compute requirements on high-resolution datasets. We evaluate our backbone architecture against a large set of baseline methods on the task of simultaneously learning the dynamics of 14 different types of PDEs in 3D. We demonstrate how to scale our model to high-resolution isotropic turbulence with spatial resolutions of up to $512^3$. Finally, we demonstrate the versatility of our network by training it as a diffusion model to produce probabilistic samples of highly turbulent 3D channel flows across varying Reynolds numbers, accurately capturing the underlying flow statistics.
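To make the hybrid CNN-Transformer idea concrete, here is a minimal PyTorch-style sketch of how such a backbone could process a small 3D patch: a convolutional encoder downsamples the patch, a transformer mixes the resulting latent tokens, and a convolutional decoder restores the patch resolution. All module names, shapes, and hyperparameters are illustrative assumptions for exposition, not the authors' actual P3D implementation.

```python
# Hedged sketch of a hybrid CNN-Transformer backbone for one 3D patch.
# Names and hyperparameters are assumptions, not the P3D code itself.
import torch
import torch.nn as nn


class HybridPatchBackbone(nn.Module):
    """Conv encoder -> transformer over latent voxels -> conv decoder."""

    def __init__(self, in_channels=4, embed_dim=128, num_heads=4, depth=2):
        super().__init__()
        # CNN encoder: two strided 3D convolutions downsample the patch 4x.
        self.encoder = nn.Sequential(
            nn.Conv3d(in_channels, embed_dim // 2, kernel_size=3, stride=2, padding=1),
            nn.GELU(),
            nn.Conv3d(embed_dim // 2, embed_dim, kernel_size=3, stride=2, padding=1),
        )
        # Transformer over the flattened latent voxels captures
        # longer-range interactions within the patch.
        layer = nn.TransformerEncoderLayer(
            d_model=embed_dim, nhead=num_heads, dim_feedforward=4 * embed_dim,
            batch_first=True, norm_first=True,
        )
        self.transformer = nn.TransformerEncoder(layer, num_layers=depth)
        # CNN decoder: transposed convolutions restore the patch resolution.
        self.decoder = nn.Sequential(
            nn.ConvTranspose3d(embed_dim, embed_dim // 2, kernel_size=4, stride=2, padding=1),
            nn.GELU(),
            nn.ConvTranspose3d(embed_dim // 2, in_channels, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, x):
        # x: (batch, channels, D, H, W) -- one small patch of the domain.
        z = self.encoder(x)                    # (B, C, d, h, w), 4x downsampled
        b, c, d, h, w = z.shape
        tokens = z.flatten(2).transpose(1, 2)  # (B, d*h*w, C) token sequence
        tokens = self.transformer(tokens)      # attention mixing across the patch
        z = tokens.transpose(1, 2).reshape(b, c, d, h, w)
        return self.decoder(z)                 # predicted next state, same shape


if __name__ == "__main__":
    # Usage: predict the next state of a 32^3 patch carrying 4 physical fields.
    model = HybridPatchBackbone(in_channels=4)
    patch = torch.randn(2, 4, 32, 32, 32)
    print(model(patch).shape)  # torch.Size([2, 4, 32, 32, 32])
```

In the full framework described above, many such per-patch predictions would be fused into a global solution, optionally coordinated by a separate sequence-to-sequence model over patch latents; that fusion step is not shown here.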
Similar Papers
Accurate and scalable deep Maxwell solvers using multilevel iterative methods
Computational Physics
Solves hard math problems faster with smart computer programs.
Low-rank surrogate modeling and stochastic zero-order optimization for training of neural networks with black-box layers
Machine Learning (CS)
Makes AI learn faster using light and math.
Training Transformers for Mesh-Based Simulations
Machine Learning (CS)
Makes computer simulations of physics faster and better.