Uniform convergence for Gaussian kernel ridge regression
By: Paul Dommel, Rajmadan Lakshmanan
Potential Business Impact:
Makes machine learning predictions more accurate and predictable.
This paper establishes the first polynomial convergence rates for Gaussian kernel ridge regression (KRR) with a fixed hyperparameter, in both the uniform norm and the $L^{2}$-norm. The uniform convergence result closes a gap in the theoretical understanding of KRR with the Gaussian kernel, for which no such rates were previously known. In addition, we prove a polynomial $L^{2}$-convergence rate in the case where the Gaussian kernel's width parameter is fixed. This also contributes to the broader understanding of smooth kernels, for which only sub-polynomial $L^{2}$-rates were previously known in similar settings. Together, these results provide new theoretical justification for using Gaussian KRR with fixed hyperparameters in nonparametric regression.
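For context, the estimator studied here is standard kernel ridge regression with the Gaussian (RBF) kernel, where both the regularization parameter and the kernel width are held fixed rather than tuned with the sample size. Below is a minimal NumPy sketch of this setup; the function names, the $(K + n\lambda I)$ regularization convention, and the specific values of `sigma` and `lam` are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma):
    """Gaussian (RBF) kernel matrix: K[i, j] = exp(-||x_i - z_j||^2 / (2 sigma^2))."""
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def krr_fit(X, y, sigma, lam):
    """Solve (K + n*lam*I) alpha = y for the KRR coefficients alpha.
    (One common normalization convention; others scale lam differently.)"""
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def krr_predict(X_train, alpha, X_test, sigma):
    """Evaluate the fitted regressor f(x) = sum_i alpha_i k(x, x_i)."""
    return gaussian_kernel(X_test, X_train, sigma) @ alpha

# Illustrative usage with fixed hyperparameters (values are assumptions):
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)
sigma, lam = 0.5, 1e-3  # fixed kernel width and regularization, not tuned with n
alpha = krr_fit(X, y, sigma, lam)
X_test = np.linspace(-1, 1, 50)[:, None]
y_hat = krr_predict(X, alpha, X_test, sigma)
```

Classical analyses typically let the width and regularization shrink as the sample size grows; the point of the paper's results is that polynomial rates can hold even in the fixed-hyperparameter regime sketched above.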
Similar Papers
On the Rate of Gaussian Approximation for Linear Regression Problems
Machine Learning (Stat)
Helps computers guess better with more data.
A general technique for approximating high-dimensional empirical kernel matrices
Machine Learning (Stat)
Makes computer predictions more accurate for complex data.
Convergence Rates for Realizations of Gaussian Random Variables
Statistics Theory
Helps computers learn from less data.