Score: 2

ReBaNO: Reduced Basis Neural Operator Mitigating Generalization Gaps and Achieving Discretization Invariance

Published: September 11, 2025 | arXiv ID: 2509.09611v1

By: Haolan Zheng, Yanlai Chen, Jiequn Han, and more

Potential Business Impact:

Solves families of physics (PDE) problems faster and with less training data than existing neural operators.

Business Areas:
A/B Testing, Data and Analytics

We propose a novel data-lean operator learning algorithm, the Reduced Basis Neural Operator (ReBaNO), to solve a group of PDEs with multiple distinct inputs. Inspired by the Reduced Basis Method and the recently introduced Generative Pre-Trained Physics-Informed Neural Networks, ReBaNO relies on a mathematically rigorous greedy algorithm to build its network structure offline, adaptively and from the ground up. Knowledge distillation via task-specific activation functions gives ReBaNO a compact, physics-embedding architecture with minimal online computational cost. Numerical results demonstrate that, compared with state-of-the-art operator learning algorithms such as PCA-Net, DeepONet, FNO, and CNO, ReBaNO significantly shrinks or eliminates the generalization gap on both in- and out-of-distribution tests and is the only operator learning algorithm among them to achieve strict discretization invariance.
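
The sketch below is only a rough illustration of the offline greedy step described in the abstract, not the authors' ReBaNO implementation: it selects reduced-basis snapshots for a toy parametrized problem. The toy solution u(x; mu) = sin(mu x), the projection-based error measure, and the tolerance are all illustrative assumptions; ReBaNO itself uses physics-informed error indicators and neural-network basis functions.

```python
import numpy as np

# Toy parametrized "PDE solutions": u(x; mu) = sin(mu * x) on a fixed grid.
# In ReBaNO these would come from physics-informed network solves, not a formula.
x = np.linspace(0.0, np.pi, 200)
mus = np.linspace(0.5, 5.0, 40)                 # candidate input parameters
snapshots = {m: np.sin(m * x) for m in mus}     # precomputed full-order "solves"

def projection_error(u, basis):
    """L2 norm of the residual after projecting u onto span(basis).
    Stands in for the rigorous error estimator used in reduced basis methods."""
    if not basis:
        return np.linalg.norm(u)
    Q, _ = np.linalg.qr(np.column_stack(basis))  # orthonormalize current basis
    return np.linalg.norm(u - Q @ (Q.T @ u))

def greedy_select(tol=1e-3, max_basis=15):
    """Greedy offline loop: repeatedly add the input whose solution is
    worst approximated by the current reduced basis, until tolerance is met."""
    basis, chosen = [], []
    for _ in range(max_basis):
        errs = {m: projection_error(snapshots[m], basis) for m in mus}
        worst = max(errs, key=errs.get)
        if errs[worst] < tol:
            break                                # all candidates well approximated
        basis.append(snapshots[worst])           # enrich the basis with the worst case
        chosen.append(worst)
    return basis, chosen

basis, chosen = greedy_select()
print(f"selected {len(basis)} basis elements at mu = {np.round(chosen, 2)}")
```

The key design choice this mimics is that the basis grows adaptively, driven by an error indicator over the training inputs, so the resulting structure stays as compact as the tolerance allows.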

Country of Origin
🇺🇸 United States

Repos / Data Links

Page Count
32 pages

Category
Computer Science:
Machine Learning (CS)