Merging Memory and Space: A State Space Neural Operator
By: Nodens F. Koren, Samuel Lanthaler
Potential Business Impact:
Teaches computers to solve complex science problems faster.
We propose the State Space Neural Operator (SS-NO), a compact architecture for learning solution operators of time-dependent partial differential equations (PDEs). Our formulation extends structured state space models (SSMs) to joint spatiotemporal modeling, introducing two key mechanisms: adaptive damping, which stabilizes learning by localizing receptive fields, and learnable frequency modulation, which enables data-driven spectral selection. These components provide a unified framework for capturing long-range dependencies with parameter efficiency. Theoretically, we establish connections between SSMs and neural operators, proving a universality theorem for convolutional architectures with full field-of-view. Empirically, SS-NO achieves state-of-the-art performance across diverse PDE benchmarks, including the 1D Burgers' and Kuramoto-Sivashinsky equations and 2D Navier-Stokes and compressible Euler flows, while using significantly fewer parameters than competing approaches. A factorized variant of SS-NO further demonstrates scalable performance on challenging 2D problems. Our results highlight the effectiveness of damping and frequency learning in operator modeling, while showing that lightweight factorization provides a complementary path toward efficient large-scale PDE learning.
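To make the two mechanisms concrete, here is a minimal sketch of a diagonal state space layer with learnable damping and learnable frequencies. This is an illustrative assumption, not the paper's actual construction: the class name, parameterization, and shapes (DampedSSMLayer, state_dim, the pole form lambda = -damping + i*freq) are all hypothetical, chosen only to show how positive damping localizes the implied convolution kernel and how trainable frequencies give data-driven spectral selection.

```python
import torch
import torch.nn as nn

class DampedSSMLayer(nn.Module):
    """Illustrative diagonal SSM layer (hypothetical names/parameterization).

    Each channel mixes state_dim damped oscillatory modes; the layer acts
    as a causal convolution whose kernel is built from the modes.
    """

    def __init__(self, channels: int, state_dim: int):
        super().__init__()
        # exp(log_damping) > 0 keeps every mode decaying, which
        # localizes the receptive field of the implied kernel
        self.log_damping = nn.Parameter(torch.zeros(channels, state_dim))
        # learnable oscillation frequencies: data-driven spectral selection
        self.freq = nn.Parameter(
            torch.linspace(0.0, torch.pi, state_dim).repeat(channels, 1))
        self.B = nn.Parameter(torch.randn(channels, state_dim) / state_dim ** 0.5)
        self.C = nn.Parameter(torch.randn(channels, state_dim) / state_dim ** 0.5)

    def kernel(self, length: int) -> torch.Tensor:
        # poles lambda = -damping + i * freq, sampled at n = 0..length-1:
        # K[n] = Re( sum_k C_k * exp(lambda_k * n) * B_k )
        n = torch.arange(length, device=self.freq.device)
        lam = -self.log_damping.exp() + 1j * self.freq        # (C, N)
        modes = torch.exp(lam[..., None] * n)                 # (C, N, L)
        return torch.einsum(
            "cn,cnl,cn->cl",
            self.C.to(modes.dtype), modes, self.B.to(modes.dtype),
        ).real                                                # (C, L)

    def forward(self, u: torch.Tensor) -> torch.Tensor:
        # u: (batch, channels, length); causal convolution via FFT,
        # zero-padded to 2L to avoid circular wrap-around
        L = u.shape[-1]
        U = torch.fft.rfft(u, n=2 * L)
        Kf = torch.fft.rfft(self.kernel(L), n=2 * L)
        return torch.fft.irfft(U * Kf, n=2 * L)[..., :L]

# usage: y = DampedSSMLayer(channels=8, state_dim=16)(torch.randn(2, 8, 64))
```

Under these assumptions, applying one such layer along the time axis and another along each spatial axis yields a cheap factorized spatiotemporal block, loosely in the spirit of the factorized SS-NO variant mentioned in the abstract.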
Similar Papers
Merging Memory and Space: A Spatiotemporal State Space Neural Operator
Machine Learning (CS)
Solves complex science problems faster with less data.
Stable spectral neural operator for learning stiff PDE systems from limited data
Computational Physics
Learns hidden rules from few examples.
Temporal Neural Operator for Modeling Time-Dependent Physical Phenomena
Machine Learning (CS)
Teaches computers to predict future events accurately.