PGOT: A Physics-Geometry Operator Transformer for Complex PDEs
By: Zhuo Zhang, Xi Yang, Yuan Zhao, et al.
While Transformers have demonstrated remarkable potential in modeling Partial Differential Equations (PDEs), modeling large-scale unstructured meshes with complex geometries remains a significant challenge. Existing efficient architectures often employ feature dimensionality reduction strategies, which inadvertently induce Geometric Aliasing, resulting in the loss of critical physical boundary information. To address this, we propose the Physics-Geometry Operator Transformer (PGOT), designed to reconstruct physical feature learning through explicit geometry awareness. Specifically, we propose Spectrum-Preserving Geometric Attention (SpecGeo-Attention). Through a ``physics slicing-geometry injection'' mechanism, this module incorporates multi-scale geometric encodings to explicitly preserve multi-scale geometric features while maintaining linear computational complexity $O(N)$. Furthermore, PGOT dynamically routes computation based on spatial coordinates: smooth regions take low-order linear paths, while shock waves and discontinuities take high-order non-linear paths, enabling spatially adaptive, high-precision physical field modeling. PGOT achieves consistent state-of-the-art performance across four standard benchmarks and excels in large-scale industrial tasks, including airfoil and car design.
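To make the ``physics slicing-geometry injection'' idea concrete, the following is a minimal NumPy sketch of how such a slice-based attention could achieve linear complexity in the number of mesh points: N points are softly assigned to M slice tokens (M fixed and much smaller than N), a multi-scale coordinate encoding is injected into the slice tokens, attention runs only among the M slices, and the result is broadcast back to the points, giving O(N·M) ≈ O(N) cost. All names, shapes, and the Fourier-style encoding here are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def specgeo_attention_sketch(feats, coords, W_slice, W_geo):
    """Hypothetical sketch of slice-based attention with geometry injection.

    feats:  (N, d) point features;  coords: (N, c) point coordinates.
    W_slice: (d, M) slicing weights; W_geo: (2*3*c, d) geometry projection.
    """
    # 1. "Physics slicing": soft assignment of N points to M slice tokens.
    assign = softmax(feats @ W_slice, axis=-1)                  # (N, M)
    denom = assign.sum(axis=0, keepdims=True).T + 1e-8          # (M, 1)
    slices = (assign.T @ feats) / denom                         # (M, d)

    # 2. "Geometry injection": pool a multi-scale (sin/cos) coordinate
    #    encoding per slice and add it to each slice token.
    scales = [1.0, 2.0, 4.0]
    enc = np.concatenate([np.sin(s * coords) for s in scales] +
                         [np.cos(s * coords) for s in scales], axis=-1)
    geo = (assign.T @ enc) / denom                              # (M, 2*3*c)
    slices = slices + geo @ W_geo                               # (M, d)

    # 3. Attention among the M slice tokens only (M << N).
    attn = softmax(slices @ slices.T / np.sqrt(slices.shape[-1]))
    slices = attn @ slices                                      # (M, d)

    # 4. Broadcast slice features back to the N points: O(N*M) overall.
    return assign @ slices                                      # (N, d)

rng = np.random.default_rng(0)
N, d, c, M = 1000, 16, 2, 8
out = specgeo_attention_sketch(
    rng.normal(size=(N, d)), rng.normal(size=(N, c)),
    rng.normal(size=(d, M)) * 0.1,
    rng.normal(size=(2 * 3 * c, d)) * 0.1,
)
print(out.shape)  # (1000, 16)
```

Because attention is computed only among the M slice tokens, doubling the number of mesh points N doubles cost instead of quadrupling it, which is the point of slice-style linear attention on large unstructured meshes.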