SAOT: An Enhanced Locality-Aware Spectral Transformer for Solving PDEs
By: Chenhong Zhou, Jie Chen, Zaifeng Yang
Potential Business Impact:
Makes computer models show tiny details better.
Neural operators have shown great potential in solving families of Partial Differential Equations (PDEs) by modeling the mappings between input and output functions. The Fourier Neural Operator (FNO) implements global convolutions by parameterizing integral operators in Fourier space; however, it often produces over-smoothed solutions and fails to capture local details and high-frequency components. To address these limitations, we investigate incorporating the spatial-frequency localization property of Wavelet transforms into the Transformer architecture. We propose a novel Wavelet Attention (WA) module with linear computational complexity that efficiently learns locality-aware features. Building upon WA, we further develop the Spectral Attention Operator Transformer (SAOT), a hybrid spectral Transformer framework that integrates WA's localized focus with the global receptive field of Fourier-based Attention (FA) through a gated fusion block. Experimental results demonstrate that WA significantly mitigates the limitations of FA and outperforms existing Wavelet-based neural operators by a large margin. By integrating locality-aware and global spectral representations, SAOT achieves state-of-the-art performance on six operator-learning benchmarks and exhibits strong discretization invariance.
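The abstract describes a gated fusion block that blends the locality-aware WA features with the global FA features. The paper does not spell out the gate's exact form, so the following is only a minimal NumPy sketch of one common design, a learned sigmoid gate that interpolates channel-wise between the two streams; the function name `gated_fusion`, the sigmoid form, and the weights `W`, `b` are all assumptions for illustration, not SAOT's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def gated_fusion(wa_feat, fa_feat, W, b):
    # Hypothetical gated fusion: a sigmoid gate (computed from both
    # streams) mixes locality-aware wavelet-attention features with
    # global Fourier-attention features, channel by channel.
    z = np.concatenate([wa_feat, fa_feat], axis=-1) @ W + b
    g = 1.0 / (1.0 + np.exp(-z))  # gate values in (0, 1)
    return g * wa_feat + (1.0 - g) * fa_feat

d = 8                                       # toy feature dimension
wa = rng.standard_normal((4, d))            # stand-in WA output (4 tokens)
fa = rng.standard_normal((4, d))            # stand-in FA output
W = rng.standard_normal((2 * d, d)) * 0.1   # gate weights (random, untrained)
b = np.zeros(d)

fused = gated_fusion(wa, fa, W, b)
print(fused.shape)  # (4, 8)
```

Because the output is a convex combination of the two inputs, each fused value lies between the corresponding WA and FA values, which is the intuition behind letting the network decide per channel how much local versus global information to pass through.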
Similar Papers
Integrating Locality-Aware Attention with Transformers for General Geometry PDEs
Machine Learning (CS)
Solves tricky math problems on weird shapes.
LOGLO-FNO: Efficient Learning of Local and Global Features in Fourier Neural Operators
Machine Learning (CS)
Helps computers understand fast, swirling movements.
Self-Attention to Operator Learning-based 3D-IC Thermal Simulation
Machine Learning (CS)
Speeds up computer chip cooling design 842 times.