Reduced-Basis Deep Operator Learning for Parametric PDEs with Independently Varying Boundary and Source Data
By: Yueqi Wang, Guang Lin
Potential Business Impact:
Speeds up repeated engineering simulations by learning a fast, physics-structured surrogate for PDE solvers.
Parametric PDEs power modern simulation, design, and digital-twin systems, yet their many-query workloads still hinge on repeatedly solving large finite-element systems. Existing operator-learning approaches accelerate this process but often rely on opaque learned trunks, require extensive labeled data, or break down when boundary and source data vary independently from physical parameters. We introduce RB-DeepONet, a hybrid operator-learning framework that fuses reduced-basis (RB) numerical structure with the branch-trunk architecture of DeepONet. The trunk is fixed to a rigorously constructed RB space generated offline via Greedy selection, granting physical interpretability, stability, and certified error control. The branch network predicts only RB coefficients and is trained label-free using a projected variational residual that targets the RB-Galerkin solution. For problems with independently varying loads or boundary conditions, we develop boundary and source modal encodings that compress exogenous data into low-dimensional coordinates while preserving accuracy. Combined with affine or empirical interpolation decompositions, RB-DeepONet achieves a strict offline-online split: all heavy lifting occurs offline, and online evaluation scales only with the RB dimension rather than the full mesh. We provide convergence guarantees separating RB approximation error from statistical learning error, and numerical experiments show that RB-DeepONet attains accuracy competitive with intrusive RB-Galerkin, POD-DeepONet, and FEONet while using dramatically fewer trainable parameters and achieving significant speedups. This establishes RB-DeepONet as an efficient, stable, and interpretable operator learner for large-scale parametric PDEs.
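To make the branch-trunk split and the label-free training described above concrete, here is a minimal, illustrative sketch in PyTorch. It assumes an affine decomposition of the operator and load and an RB basis already built offline by a greedy procedure; the names (BranchNet, projected_residual_loss, the toy matrices) are hypothetical stand-ins, not the authors' implementation. The trunk is a fixed RB basis V, the branch network predicts only the r RB coefficients, and training minimizes the projected variational residual of the RB-Galerkin system without labeled solutions.

```python
# Minimal sketch of the RB-DeepONet structure described in the abstract.
# Assumes an affine decomposition A(mu) = sum_q theta_q(mu) A_q,
# f(mu) = sum_p phi_p(mu) f_p, and an RB basis V (n_dof x r) built offline.
# All names and dimensions below are illustrative placeholders.

import torch
import torch.nn as nn

n_dof, r, n_mu = 500, 12, 3            # full mesh size, RB dimension, #parameters

# Fixed "trunk": reduced-basis matrix V built offline (random orthonormal stand-in here).
V = torch.linalg.qr(torch.randn(n_dof, r)).Q

# Toy affine operator/load components (stand-ins for precomputed FE matrices/vectors).
A_q = [torch.eye(n_dof), torch.diag(torch.rand(n_dof))]
f_p = [torch.randn(n_dof)]

def theta(mu):                          # affine coefficients theta_q(mu)
    return [1.0, mu[..., 0]]

def phi(mu):                            # load coefficients phi_p(mu)
    return [1.0 + mu[..., 1]]

# Offline projection: reduced matrices V^T A_q V and vectors V^T f_p (size r only).
A_q_r = [V.T @ Aq @ V for Aq in A_q]
f_p_r = [V.T @ fp for fp in f_p]

class BranchNet(nn.Module):
    """Branch network: maps parameters (and modal data coordinates) to RB coefficients."""
    def __init__(self, n_in, r):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n_in, 64), nn.Tanh(),
                                 nn.Linear(64, 64), nn.Tanh(),
                                 nn.Linear(64, r))
    def forward(self, mu):
        return self.net(mu)

branch = BranchNet(n_mu, r)
opt = torch.optim.Adam(branch.parameters(), lr=1e-3)

def projected_residual_loss(mu_batch):
    """Label-free loss: residual of the reduced (RB-Galerkin) system, assembled at size r."""
    c = branch(mu_batch)                                         # (B, r) predicted RB coefficients
    loss = 0.0
    for mu, c_i in zip(mu_batch, c):
        A_r = sum(t * Aq for t, Aq in zip(theta(mu), A_q_r))     # r x r reduced operator
        f_r = sum(p * fp for p, fp in zip(phi(mu), f_p_r))       # reduced load vector
        loss = loss + ((A_r @ c_i - f_r) ** 2).sum()
    return loss / len(mu_batch)

for step in range(200):                 # offline training on sampled parameters, no labels
    mu_batch = torch.rand(32, n_mu)
    opt.zero_grad()
    projected_residual_loss(mu_batch).backward()
    opt.step()

# Online evaluation: cost scales with r, not the mesh; full-field recovery is one mat-vec.
mu_new = torch.rand(1, n_mu)
u_full = V @ branch(mu_new).squeeze(0)  # approximate FE solution on the full mesh
```

In this sketch the online stage touches only r-dimensional quantities, which is the source of the offline-online speedup the abstract claims; handling independently varying boundary or source data would additionally feed low-dimensional modal coordinates of that data into the branch input, as the paper's boundary and source encodings describe.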
Similar Papers
DeepONet Augmented by Randomized Neural Networks for Efficient Operator Learning in PDEs
Machine Learning (CS)
Solves hard math problems much faster.
Efficient Transformer-Inspired Variants of Physics-Informed Deep Operator Networks
Machine Learning (CS)
Makes computer math problems solve faster, more accurately.
DD-DeepONet: Domain decomposition and DeepONet for solving partial differential equations in three application scenarios
Numerical Analysis
Solves hard math problems much faster for engineers.