Fast kernel methods: Sobolev, physics-informed, and additive models
By: Nathan Doumèche, Francis Bach, Gérard Biau, and more
Potential Business Impact:
Makes kernel-based machine learning fast enough to handle billions of data points.
Kernel methods are powerful tools in statistical learning, but their cubic complexity in the sample size n limits their use on large-scale datasets. In this work, we introduce a scalable framework for kernel regression with O(n log n) complexity, fully leveraging GPU acceleration. The approach is based on a Fourier representation of kernels combined with non-uniform fast Fourier transforms (NUFFT), enabling exact, fast, and memory-efficient computations. We instantiate our framework in three settings: Sobolev kernel regression, physics-informed regression, and additive models. Whenever minimax convergence rates are known, the proposed estimators are shown to achieve them, consistent with classical kernel theory. Empirical results demonstrate that our methods can process up to tens of billions of samples within minutes, providing both statistical accuracy and computational scalability. These contributions establish a flexible approach, paving the way for the routine application of kernel methods in large-scale learning tasks.
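The pipeline the abstract describes (a truncated Fourier representation of the kernel, followed by regularized least squares over the Fourier modes) can be sketched in a few lines of NumPy. This is a minimal illustration under assumptions, not the authors' implementation: the mode count m, Sobolev smoothness s, and ridge parameter lam are illustrative choices, and the naive O(nm) feature products below are exactly the step the paper replaces with NUFFTs (via a library such as FINUFFT) to reach O(n log n).

```python
import numpy as np

# Minimal sketch (not the authors' code): kernel ridge regression with a
# truncated Fourier representation of a Sobolev kernel on [0, 1).
# m, s, and lam are hypothetical demo parameters.
rng = np.random.default_rng(0)
n, m, s, lam = 2000, 33, 1.0, 1e-3

x = rng.uniform(0.0, 1.0, n)                      # nonuniform sample points
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(n)

k = np.arange(-(m // 2), m // 2 + 1)              # Fourier frequencies
lam_k = (1.0 + k.astype(float) ** 2) ** (-s)      # Sobolev spectral weights

# Feature matrix Phi[j, l] = exp(2i*pi*k[l]*x[j]). The paper evaluates the
# products Phi^H y and Phi^H Phi with NUFFTs in O(n log n); here we form
# Phi explicitly for clarity, which costs O(n m) memory and O(n m^2) time.
Phi = np.exp(2j * np.pi * np.outer(x, k))

# Ridge regression in feature space with penalty sum_l |theta_l|^2 / lam_k[l].
A = Phi.conj().T @ Phi + n * lam * np.diag(1.0 / lam_k)
theta = np.linalg.solve(A, Phi.conj().T @ y)

# The estimator is f_hat(t) = sum_l theta[l] * exp(2i*pi*k[l]*t).
t = np.linspace(0.0, 1.0, 200)
f_hat = (np.exp(2j * np.pi * np.outer(t, k)) @ theta).real
print(float(np.abs(f_hat - np.sin(2 * np.pi * t)).max()))
```

One structural observation worth noting: (Phi^H Phi)[k, k'] = sum_j exp(-2i*pi*(k - k')*x[j]) depends only on the difference k - k', so the Gram matrix over modes is Toeplitz. This is the kind of structure that makes FFT-based matrix-vector products, and hence iterative solvers, feasible at scale; how the paper organizes this computation exactly is best taken from the paper itself.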
Similar Papers
Numerical Methods for Kernel Slicing
Numerical Analysis
Finds patterns faster in huge amounts of data.
Data-Efficient Kernel Methods for Learning Differential Equations and Their Solution Operators: Algorithms and Error Analysis
Machine Learning (Stat)
Teaches computers to solve math problems faster.
Preconditioned Additive Gaussian Processes with Fourier Acceleration
Machine Learning (CS)
Makes computer predictions faster and more accurate.