An Efficient Graph-Transformer Operator for Learning Physical Dynamics with Manifolds Embedding
By: Pengwei Liu, Xingyu Ren, Pengkai Wang, and more
Potential Business Impact:
Makes scientific and engineering computer simulations faster.
Accurate and efficient physical simulations are essential in science and engineering, yet traditional numerical solvers incur significant computational cost when handling dynamic scenarios involving complex geometries, varying boundary/initial conditions, and diverse physical parameters. While deep learning offers promising alternatives, existing methods often struggle with flexibility and generalization, particularly on unstructured meshes, which significantly limits their practical applicability. To address these challenges, we propose PhysGTO, an efficient Graph-Transformer Operator for learning physical dynamics through explicit manifold embeddings in both the physical and latent spaces. In the physical space, the proposed Unified Graph Embedding module aligns node-level conditions and constructs sparse yet structure-preserving graph connectivity to process heterogeneous inputs. In the latent space, PhysGTO integrates a lightweight flux-oriented message-passing scheme with projection-inspired attention to capture local and global dependencies, facilitating multilevel interactions among complex physical correlations. This design ensures linear complexity in the number of mesh points, reducing both the number of trainable parameters and the computational cost in floating-point operations (FLOPs), thereby enabling efficient inference in real-time applications. We introduce a comprehensive benchmark spanning eleven datasets, covering problems with unstructured meshes, transient flow dynamics, and large-scale 3D geometries. PhysGTO consistently achieves state-of-the-art accuracy while significantly reducing computational costs, demonstrating superior flexibility, scalability, and generalization across a wide range of simulation tasks.
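To make the architectural idea in the abstract concrete, below is a minimal PyTorch sketch of a block that combines local message passing over mesh edges with attention routed through a small set of learned latent tokens, which keeps the cost linear in the number of mesh nodes. This is not the authors' PhysGTO code: the class names (EdgeMessagePassing, LatentProjectionAttention, GraphTransformerBlock), the use of nn.MultiheadAttention as a stand-in for the paper's projection-inspired attention, and choices such as 64 latent tokens are illustrative assumptions.

# Illustrative sketch only; names and design choices are assumptions,
# not the authors' actual PhysGTO implementation.
import torch
import torch.nn as nn


class EdgeMessagePassing(nn.Module):
    """Local update: aggregate edge (flux-style) messages into node features."""

    def __init__(self, dim: int):
        super().__init__()
        self.edge_mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.GELU(), nn.Linear(dim, dim))
        self.node_mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.GELU(), nn.Linear(dim, dim))

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        # x: (N, dim) node features; edge_index: (2, E) sender/receiver node indices
        src, dst = edge_index
        messages = self.edge_mlp(torch.cat([x[src], x[dst]], dim=-1))  # (E, dim)
        agg = torch.zeros_like(x).index_add_(0, dst, messages)         # sum messages per receiver
        return x + self.node_mlp(torch.cat([x, agg], dim=-1))          # residual node update


class LatentProjectionAttention(nn.Module):
    """Global update: nodes exchange information via M learned latent tokens, O(N*M)."""

    def __init__(self, dim: int, num_latents: int = 64, heads: int = 4):
        super().__init__()
        self.latents = nn.Parameter(torch.randn(num_latents, dim) * 0.02)
        self.encode = nn.MultiheadAttention(dim, heads, batch_first=True)  # latents attend to nodes
        self.decode = nn.MultiheadAttention(dim, heads, batch_first=True)  # nodes attend to latents

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        nodes = x.unsqueeze(0)                    # (1, N, dim)
        lat = self.latents.unsqueeze(0)           # (1, M, dim), with M fixed and M << N
        lat, _ = self.encode(lat, nodes, nodes)   # gather global context into latents
        out, _ = self.decode(nodes, lat, lat)     # broadcast global context back to nodes
        return x + out.squeeze(0)                 # residual node update


class GraphTransformerBlock(nn.Module):
    """One local + global block; stacking several forms the operator."""

    def __init__(self, dim: int):
        super().__init__()
        self.local = EdgeMessagePassing(dim)
        self.global_mix = LatentProjectionAttention(dim)

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        return self.global_mix(self.local(x, edge_index))


if __name__ == "__main__":
    x = torch.randn(1000, 128)                      # 1000 mesh nodes with 128-dim features
    edge_index = torch.randint(0, 1000, (2, 4000))  # 4000 random mesh edges
    block = GraphTransformerBlock(128)
    print(block(x, edge_index).shape)               # torch.Size([1000, 128])

Because the number of latent tokens is fixed, both attention passes scale linearly with the number of mesh nodes, which is the property the abstract highlights; the sparse edge aggregation likewise scales with the number of mesh edges.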
Similar Papers
GITO: Graph-Informed Transformer Operator for Learning Complex Partial Differential Equations
Machine Learning (CS)
Solves hard math problems on any shape.
Training Transformers for Mesh-Based Simulations
Machine Learning (CS)
Makes computer simulations of physics faster and better.
Topology-aware Neural Flux Prediction Guided by Physics
Machine Learning (CS)
Helps computers understand traffic flow better.