General superconvergence for kernel-based approximation
By: Toni Karvonen, Gabriele Santin, Tizian Wenzel
Potential Business Impact:
Enables faster, more accurate kernel-based approximation when the target function is smoother than the model assumes.
Kernel interpolation is a fundamental technique for approximating functions from scattered data, with a well-understood convergence theory when interpolating elements of a reproducing kernel Hilbert space. Beyond this classical setting, research has focused on two regimes: misspecified interpolation, where the kernel smoothness exceeds that of the target function, and superconvergence, where the target is smoother than the functions in the Hilbert space. This work addresses the latter, where smoother target functions yield improved convergence rates, and extends existing results by characterizing superconvergence for projections in general Hilbert spaces. We show that functions lying in the ranges of certain operators, including adjoints of embeddings, exhibit accelerated convergence, a result we extend across interpolation scales between these ranges and the full Hilbert space. In particular, we analyze Mercer operators and embeddings into $L_p$ spaces, linking the images of adjoint operators to Mercer power spaces. Applications to Sobolev spaces are discussed in detail, highlighting how superconvergence depends critically on boundary conditions. Our findings generalize and refine previous results, offering a broader framework for understanding and exploiting superconvergence. The results are supported by numerical experiments.
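As a rough numerical illustration of the phenomenon the abstract describes (not the authors' code or experiments), the following Python sketch interpolates two targets on [0, 1] with a kernel whose native space is of Sobolev type and prints the empirically observed L2 convergence orders. The Matérn-3/2 kernel, the choice of test functions, and the rate estimate via successive error ratios are illustrative assumptions; the intent is only to show that a smoother target typically converges at a visibly faster rate than a rough one.

    # Minimal sketch: kernel interpolation error rates for a rough vs. a smooth target.
    # All modeling choices here (kernel, length-scale, targets) are assumptions for
    # illustration, not taken from the paper.
    import numpy as np

    def matern32_kernel(x, y, ell=0.3):
        # Matern-3/2 kernel; its native space is (roughly) a Sobolev space.
        r = np.abs(x[:, None] - y[None, :]) / ell
        return (1.0 + np.sqrt(3.0) * r) * np.exp(-np.sqrt(3.0) * r)

    def interpolate(x_train, y_train, x_test, jitter=1e-10):
        # Kernel interpolant s(x) = k(x, X) K^{-1} y; small jitter for numerical stability.
        K = matern32_kernel(x_train, x_train) + jitter * np.eye(len(x_train))
        coeffs = np.linalg.solve(K, y_train)
        return matern32_kernel(x_test, x_train) @ coeffs

    x_test = np.linspace(0.0, 1.0, 2000)
    targets = {
        "rough |x - 1/2|": lambda x: np.abs(x - 0.5),       # limited smoothness
        "smooth sin(2*pi*x)": lambda x: np.sin(2 * np.pi * x),  # very smooth target
    }

    for name, f in targets.items():
        errors, ns = [], [20, 40, 80, 160, 320]
        for n in ns:
            x_train = np.linspace(0.0, 1.0, n)
            err = interpolate(x_train, f(x_train), x_test) - f(x_test)
            errors.append(np.sqrt(np.mean(err ** 2)))       # discrete L2 error
        # Observed convergence order: fill distance roughly halves as n doubles.
        rates = np.log2(np.array(errors[:-1]) / np.array(errors[1:]))
        print(f"{name}: observed L2 orders = {rates.round(2)}")

Note that this sketch only probes the qualitative effect; the exact attainable rates, and their dependence on boundary conditions, are what the paper characterizes theoretically.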
Similar Papers
Sobolev norm inconsistency of kernel interpolation
Machine Learning (Stat)
Studies when kernel interpolants fail to converge in Sobolev norms.
On the Convergence of Irregular Sampling in Reproducing Kernel Hilbert Spaces
Machine Learning (Stat)
Analyzes how well functions can be recovered from irregularly sampled data in reproducing kernel Hilbert spaces.
A Kernel-based Stochastic Approximation Framework for Nonlinear Operator Learning
Machine Learning (Stat)
Proposes a kernel-based stochastic approximation framework for learning nonlinear operators.