Regularized Random Fourier Features and Finite Element Reconstruction for Operator Learning in Sobolev Space
By: Xinyue Yu, Hayden Schaeffer
Operator learning is the data-driven approximation of mappings between infinite-dimensional function spaces, such as the solution operators of partial differential equations. Kernel-based operator learning methods can offer accurate, theoretically justified approximations that require less training than standard approaches; however, they can become computationally prohibitive for large training sets and can be sensitive to noise. We propose a regularized random Fourier feature (RRFF) approach, coupled with a finite element reconstruction map (RRFF-FEM), for learning operators from noisy data. The method uses random features drawn from multivariate Student's $t$ distributions, together with frequency-weighted Tikhonov regularization that suppresses high-frequency noise. We establish high-probability bounds on the extreme singular values of the associated random feature matrix and show that when the number of features $N$ scales like $m \log m$ with the number of training samples $m$, the system is well-conditioned, which yields estimation and generalization guarantees. Detailed numerical experiments on benchmark PDE problems, including advection, Burgers', Darcy flow, Helmholtz, Navier-Stokes, and structural mechanics, demonstrate that RRFF and RRFF-FEM are robust to noise and achieve improved performance with reduced training time compared to the unregularized random feature model, while maintaining competitive accuracy relative to kernel and neural operator baselines.
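To make the main ingredients of the abstract concrete, the sketch below assembles a random Fourier feature regression with frequencies drawn from a multivariate Student's $t$ distribution and a Tikhonov penalty weighted by frequency magnitude, so that high-frequency features are damped more strongly. This is a minimal illustrative sketch, not the authors' implementation: the cosine feature map, the particular weight $1 + \|\omega\|^2$, the function names (`student_t_frequencies`, `rrff_fit`), and the default parameters are assumptions, and the finite element reconstruction step of RRFF-FEM is omitted.

```python
import numpy as np

def student_t_frequencies(N, d, nu=3.0, scale=1.0, rng=None):
    """Draw N frequency vectors in R^d from a multivariate Student's t distribution.

    Uses the standard construction: a Gaussian sample divided by the square root
    of an independent chi-squared variable scaled by its degrees of freedom nu.
    """
    rng = np.random.default_rng(rng)
    g = scale * rng.standard_normal((N, d))
    chi2 = rng.chisquare(nu, size=(N, 1))
    return g / np.sqrt(chi2 / nu)

def rrff_fit(X, y, N=512, nu=3.0, lam=1e-3, seed=0):
    """Fit a regularized random Fourier feature model on sampled data.

    X : (m, d) array of discretized inputs, y : (m,) targets.
    Frequency-weighted Tikhonov regularization penalizes features with large
    |omega| more heavily, which suppresses high-frequency noise.
    The weight 1 + |omega|^2 is an illustrative choice, not the paper's.
    """
    rng = np.random.default_rng(seed)
    m, d = X.shape
    omega = student_t_frequencies(N, d, nu=nu, rng=rng)
    b = rng.uniform(0.0, 2.0 * np.pi, N)
    A = np.sqrt(2.0 / N) * np.cos(X @ omega.T + b)      # (m, N) random feature matrix
    w = 1.0 + np.linalg.norm(omega, axis=1) ** 2        # frequency-dependent weights
    # Solve the regularized normal equations (A^T A + lam * diag(w)) c = A^T y.
    c = np.linalg.solve(A.T @ A + lam * np.diag(w), A.T @ y)
    return omega, b, c

def rrff_predict(X, omega, b, c):
    """Evaluate the fitted random feature model at new inputs X."""
    N = omega.shape[0]
    return (np.sqrt(2.0 / N) * np.cos(X @ omega.T + b)) @ c

if __name__ == "__main__":
    # Toy 1D regression with noisy samples.
    rng = np.random.default_rng(1)
    X = rng.uniform(-1.0, 1.0, size=(200, 1))
    y = np.sin(4.0 * X[:, 0]) + 0.1 * rng.standard_normal(200)
    omega, b, c = rrff_fit(X, y, N=400, lam=1e-2)
    print("training RMSE:", np.sqrt(np.mean((rrff_predict(X, omega, b, c) - y) ** 2)))
```

In the operator-learning setting described above, the inputs would be discretized input functions and the targets would be samples of the output function, with the number of features chosen on the order of $m \log m$ as suggested by the paper's conditioning result; the finite-dimensional regression shown here only illustrates the regularized random feature solve itself.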
Similar Papers
Cauchy Random Features for Operator Learning in Sobolev Space
Machine Learning (CS)
Teaches computers to learn math faster.
Learning convolution operators on compact Abelian groups
Machine Learning (CS)
Teaches computers to understand patterns in sound.
Rates and architectures for learning geometrically non-trivial operators
Machine Learning (CS)
Learns math problems from few examples.