SAOT: An Enhanced Locality-Aware Spectral Transformer for Solving PDEs

Published: November 24, 2025 | arXiv ID: 2511.18777v1

By: Chenhong Zhou, Jie Chen, Zaifeng Yang

Potential Business Impact:

Improves the ability of PDE simulation models to capture fine-scale details and high-frequency behavior.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Neural operators have shown great potential in solving families of Partial Differential Equations (PDEs) by modeling the mappings between input and output functions. The Fourier Neural Operator (FNO) implements global convolutions by parameterizing integral operators in Fourier space. However, it often produces over-smoothed solutions and fails to capture local details and high-frequency components. To address these limitations, we investigate incorporating the spatial-frequency localization property of wavelet transforms into the Transformer architecture. We propose a novel Wavelet Attention (WA) module with linear computational complexity that efficiently learns locality-aware features. Building upon WA, we further develop the Spectral Attention Operator Transformer (SAOT), a hybrid spectral Transformer framework that integrates WA's localized focus with the global receptive field of Fourier-based Attention (FA) through a gated fusion block. Experimental results demonstrate that WA significantly mitigates the limitations of FA and outperforms existing wavelet-based neural operators by a large margin. By integrating locality-aware and global spectral representations, SAOT achieves state-of-the-art performance on six operator learning benchmarks and exhibits strong discretization invariance.
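To make the architecture concrete, the sketch below illustrates the general idea of gating a global Fourier-space branch against a local wavelet-space branch in 1D. This is a minimal NumPy illustration under stated assumptions, not the paper's implementation: the function names (`haar_dwt`, `spectral_mix`), the single-level Haar transform, and the scalar sigmoid gate are all hypothetical simplifications of SAOT's learned attention modules and gated fusion block.

```python
import numpy as np

def haar_dwt(x):
    # Single-level Haar wavelet transform (assumes even-length input).
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation (low-frequency)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail (localized high-frequency)
    return a, d

def haar_idwt(a, d):
    # Inverse of haar_dwt: interleave the reconstructed even/odd samples.
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def spectral_mix(x, w_global, w_local, gate_logit):
    """Gated fusion of a Fourier (global) branch and a wavelet (local) branch.

    w_global:   per-mode weights in Fourier space (global receptive field);
                length must be x.size // 2 + 1 to match np.fft.rfft output.
    w_local:    per-coefficient weights on the Haar detail coefficients
                (spatially localized reweighting); length x.size // 2.
    gate_logit: scalar logit for the mixing gate (stand-in for a learned gate).
    """
    # Fourier branch: global convolution as a pointwise product of modes.
    fourier_out = np.fft.irfft(np.fft.rfft(x) * w_global, n=x.size)

    # Wavelet branch: reweight localized detail coefficients, keep the
    # approximation coefficients unchanged, then invert the transform.
    a, d = haar_dwt(x)
    wavelet_out = haar_idwt(a, d * w_local)

    g = 1.0 / (1.0 + np.exp(-gate_logit))   # sigmoid gate in (0, 1)
    return g * fourier_out + (1.0 - g) * wavelet_out

# With identity weights in both branches, each branch reconstructs the
# input exactly, so the gated mixture also returns the input.
x = np.sin(np.linspace(0, 2 * np.pi, 16, endpoint=False))
y = spectral_mix(x, np.ones(9), np.ones(8), 0.0)
```

In SAOT itself, both branch weights and the gate are learned inside Transformer attention modules; the point of the sketch is only the structure: one branch mixes information globally in Fourier space while the other reweights spatially localized wavelet coefficients, and a gate fuses the two outputs.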

Country of Origin
🇭🇰 Hong Kong

Repos / Data Links

Page Count
9 pages

Category
Computer Science:
Machine Learning (CS)