Forcing and Diagnosing Failure Modes of Fourier Neural Operators Across Diverse PDE Families
By: Lennon Shikhman
Potential Business Impact:
Makes AI better at predicting complex changes.
Fourier Neural Operators (FNOs) have shown strong performance in learning solution maps of partial differential equations (PDEs), but their robustness under distribution shifts, long-horizon rollouts, and structural perturbations remains poorly understood. We present a systematic stress-testing framework that probes failure modes of FNOs across five qualitatively different PDE families: dispersive, elliptic, multi-scale fluid, financial, and chaotic systems. Rather than optimizing in-distribution accuracy, we design controlled stress tests, including parameter shifts, boundary or terminal condition changes, resolution extrapolation with spectral analysis, and iterative rollouts, to expose vulnerabilities such as spectral bias, compounding integration errors, and overfitting to restricted boundary regimes. Our large-scale evaluation (1,000 trained models) reveals that distribution shifts in parameters or boundary conditions can inflate errors by more than an order of magnitude, while resolution changes primarily concentrate error in high-frequency modes. Input perturbations generally do not amplify error, though worst-case scenarios (e.g., localized Poisson perturbations) remain challenging. These findings provide a comparative failure-mode atlas and actionable insights for improving robustness in operator learning.
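Two of the diagnostics named above, compounding rollout error and error concentration in high-frequency Fourier modes, can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's actual framework: the function names (`rollout_error`, `spectral_error`) and the toy one-step maps standing in for a reference solver and an FNO surrogate are hypothetical.

```python
import numpy as np

def rollout_error(step_true, step_model, u0, n_steps):
    """Roll the reference solver and the surrogate forward together,
    recording relative L2 error after each step (illustrative)."""
    u_true, u_model, errs = u0.copy(), u0.copy(), []
    for _ in range(n_steps):
        u_true = step_true(u_true)
        u_model = step_model(u_model)
        errs.append(np.linalg.norm(u_model - u_true) / np.linalg.norm(u_true))
    return np.array(errs)

def spectral_error(u_true, u_model):
    """Per-frequency magnitudes of the residual via FFT, to check
    whether error concentrates in high-frequency modes."""
    return np.abs(np.fft.rfft(u_model - u_true))

# Toy dynamics (hypothetical): the reference map is a mild decay;
# the "surrogate" adds a small bias at Fourier mode 20.
x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
u0 = np.sin(x) + 0.3 * np.sin(5 * x)
step_true = lambda u: 0.99 * u
step_model = lambda u: 0.99 * u + 1e-3 * np.sin(20 * x)

errs = rollout_error(step_true, step_model, u0, n_steps=50)
print(errs[-1] > errs[0])                       # error compounds over the rollout
print(spectral_error(step_true(u0), step_model(u0)).argmax())  # residual peaks at mode 20
```

In this toy setting the per-step bias accumulates, so the relative error grows monotonically with rollout length, and the FFT of the one-step residual peaks exactly at the biased mode, mirroring the two failure signatures the abstract describes.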
Similar Papers
Fourier Neural Operators Explained: A Practical Perspective
Machine Learning (CS)
Teaches computers to solve hard math problems faster.
An Inverse Scattering Inspired Fourier Neural Operator for Time-Dependent PDE Learning
Machine Learning (CS)
Predicts chaotic systems accurately for much longer.
Fourier Neural Operators for Structural Dynamics Models: Challenges, Limitations and Advantages of Using a Spectrogram Loss
Computational Engineering, Finance, and Science
Makes computer models predict structural vibrations better.