Convergence Rates for Learning Pseudo-Differential Operators
By: Jiaheng Chen, Daniel Sanz-Alonso
Potential Business Impact:
Teaches computers to learn solvers for hard physics equations, speeding up simulations.
This paper establishes convergence rates for learning elliptic pseudo-differential operators, a fundamental operator class in partial differential equations and mathematical physics. In a wavelet-Galerkin framework, we formulate learning over this class as a structured infinite-dimensional regression problem with multiscale sparsity. Building on this structure, we propose a sparse, data- and computation-efficient estimator, which leverages a novel matrix compression scheme tailored to the learning task and a nested-support strategy to balance approximation and estimation errors. In addition to obtaining convergence rates for the estimator, we show that the learned operator induces an efficient and stable Galerkin solver whose numerical error matches its statistical accuracy. Our results therefore contribute to bringing together operator learning, data-driven solvers, and wavelet methods in scientific computing.
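The abstract's core idea, representing an operator in a wavelet basis, estimating it from input-output data, and compressing small wavelet-domain entries, can be illustrated with a minimal sketch. This is not the paper's actual estimator: the Haar basis, the synthetic Gaussian-kernel operator, the least-squares fit, and the hard-thresholding step are all simplifying stand-ins for the multiscale sparsity structure and compression scheme the paper develops.

```python
import numpy as np

def haar_matrix(n):
    # orthonormal Haar wavelet transform matrix (n must be a power of 2)
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    top = np.kron(h, [1.0, 1.0])                 # scaling (coarse) rows
    bot = np.kron(np.eye(n // 2), [1.0, -1.0])   # detail (fine) rows
    return np.vstack([top, bot]) / np.sqrt(2.0)

rng = np.random.default_rng(0)
n, m = 16, 200
W = haar_matrix(n)

# hypothetical "ground truth" operator: a smoothing (Gaussian) kernel,
# which is approximately sparse in the wavelet basis
x = np.arange(n)
A_true = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)

# training data: random input functions and noisy outputs f = A u + noise
U = rng.standard_normal((n, m))
F = A_true @ U + 0.01 * rng.standard_normal((n, m))

# regression in wavelet coordinates: estimate B = W A W^T by least squares
Uw, Fw = W @ U, W @ F
B_hat = Fw @ np.linalg.pinv(Uw)

# crude "matrix compression": hard-threshold small wavelet-domain entries
tau = 0.05 * np.abs(B_hat).max()
B_sparse = np.where(np.abs(B_hat) > tau, B_hat, 0.0)
A_hat = W.T @ B_sparse @ W                       # back to physical coordinates

rel_err = np.linalg.norm(A_hat - A_true) / np.linalg.norm(A_true)
sparsity = (B_sparse != 0).mean()
print(f"relative error {rel_err:.3f}, nonzero fraction {sparsity:.2f}")
```

Because the smoothing kernel is nearly sparse in the wavelet basis, most entries of the estimated matrix fall below the threshold, so the compressed estimate is both accurate and cheap to apply, which is the kind of trade-off the paper quantifies with convergence rates.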
Similar Papers
Rates and architectures for learning geometrically non-trivial operators
Machine Learning (CS)
Learns mathematical operators from few training examples.
Operator Learning: A Statistical Perspective
Machine Learning (Stat)
Teaches computers to predict how physical systems behave.
Solving Functional PDEs with Gaussian Processes and Applications to Functional Renormalization Group Equations
Machine Learning (CS)
Helps scientists solve hard math problems faster.