Merging Memory and Space: A State Space Neural Operator

Published: July 31, 2025 | arXiv ID: 2507.23428v3

By: Nodens F. Koren, Samuel Lanthaler

Potential Business Impact:

Trains neural networks to predict how physical systems (fluids, turbulence) evolve over time, much faster than conventional simulation and with fewer parameters than competing models.

Business Areas:
DSP Hardware

We propose the State Space Neural Operator (SS-NO), a compact architecture for learning solution operators of time-dependent partial differential equations (PDEs). Our formulation extends structured state space models (SSMs) to joint spatiotemporal modeling, introducing two key mechanisms: adaptive damping, which stabilizes learning by localizing receptive fields, and learnable frequency modulation, which enables data-driven spectral selection. These components provide a unified framework for capturing long-range dependencies with parameter efficiency. Theoretically, we establish connections between SSMs and neural operators, proving a universality theorem for convolutional architectures with full field-of-view. Empirically, SS-NO achieves state-of-the-art performance across diverse PDE benchmarks, including the 1D Burgers' and Kuramoto-Sivashinsky equations and 2D Navier-Stokes and compressible Euler flows, while using significantly fewer parameters than competing approaches. A factorized variant of SS-NO further demonstrates scalable performance on challenging 2D problems. Our results highlight the effectiveness of damping and frequency learning in operator modeling, while showing that lightweight factorization provides a complementary path toward efficient large-scale PDE learning.
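The two mechanisms the abstract names can be illustrated with a minimal diagonal state-space convolution kernel: each channel carries a learnable decay rate (adaptive damping, which shortens or lengthens its effective memory) and a learnable oscillation frequency (data-driven spectral selection). The sketch below is an assumption-laden toy in NumPy, not the authors' implementation; all function names, shapes, and the cosine parameterization are illustrative choices.

```python
import numpy as np

def ssm_kernel(alpha, omega, length, dt=1.0):
    """Build real convolution kernels from damped complex exponentials.

    alpha : (d,) positive decay rates -- larger alpha => shorter memory
            (this is the "adaptive damping" that localizes the receptive field).
    omega : (d,) oscillation frequencies -- the "learnable frequency
            modulation" selecting which spectral content each channel tracks.
    Returns a (d, length) kernel k[n, t] = exp(-alpha[n]*t) * cos(omega[n]*t).
    """
    t = np.arange(length) * dt                    # discretized time axis
    decay = np.exp(-alpha[:, None] * t[None, :])  # adaptive damping envelope
    osc = np.cos(omega[:, None] * t[None, :])     # frequency modulation
    return decay * osc

def apply_ssm(u, alpha, omega):
    """Causally convolve each channel of u (shape (d, T)) with its kernel."""
    d, T = u.shape
    k = ssm_kernel(alpha, omega, T)
    # Zero-padded FFT convolution, truncated back to length T (causal part).
    n = 2 * T
    y = np.fft.irfft(np.fft.rfft(u, n=n) * np.fft.rfft(k, n=n), n=n)[:, :T]
    return y

# Usage: an impulse input on two channels -- one heavily damped (local
# receptive field), one weakly damped with slow oscillation (long memory).
u = np.zeros((2, 64))
u[:, 0] = 1.0
y = apply_ssm(u, alpha=np.array([1.0, 0.05]), omega=np.array([0.0, 0.5]))
```

Because the input is an impulse, the output equals the kernel itself: the first channel's response dies off within a few steps, while the second channel rings slowly for the full window, showing how damping alone controls the trade-off between local and long-range dependence.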

Country of Origin
🇦🇹 🇨🇭 Austria, Switzerland

Page Count
31 pages

Category
Computer Science:
Machine Learning (CS)