Adaptive sparse variational approximations for Gaussian process regression
By: Dennis Nieman, Botond Szabó
Potential Business Impact:
Provides guarantees that automatic hyperparameter tuning of Gaussian process models remains statistically reliable, so scalable approximate models can be deployed with confidence.
Accurate tuning of hyperparameters is crucial to ensure that models can generalise effectively across different settings. In this paper, we present theoretical guarantees for hyperparameter selection using variational Bayes in the nonparametric regression model. We construct a variational approximation to a hierarchical Bayes procedure, and derive upper bounds for the contraction rate of the variational posterior in an abstract setting. The theory is applied to various Gaussian process priors and variational classes, resulting in minimax optimal rates. Our theoretical results are accompanied by numerical analysis on both synthetic and real-world data sets.
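To illustrate the kind of procedure the paper studies, here is a minimal sketch of sparse variational Gaussian process regression with empirical hyperparameter selection. It uses the standard Titsias-style collapsed evidence lower bound with a fixed set of inducing inputs and a simple grid search over the kernel lengthscale; this is an assumption-laden toy stand-in, not the paper's hierarchical Bayes construction, and all function names and parameter values below are illustrative.

```python
import numpy as np

def rbf(x1, x2, lengthscale):
    # Squared-exponential kernel matrix between 1-d input arrays.
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def titsias_elbo(x, y, z, lengthscale, noise_var):
    # Collapsed variational lower bound for sparse GP regression:
    #   log N(y; 0, Q + s2*I) - tr(K - Q) / (2*s2),
    # where Q = Knm Kmm^{-1} Kmn is the Nystrom approximation.
    n = len(x)
    Kmm = rbf(z, z, lengthscale) + 1e-8 * np.eye(len(z))  # jitter for stability
    Knm = rbf(x, z, lengthscale)
    L = np.linalg.cholesky(Kmm)
    A = np.linalg.solve(L, Knm.T)          # (m, n); Q = A.T @ A
    Q = A.T @ A
    cov = Q + noise_var * np.eye(n)
    _, logdet = np.linalg.slogdet(cov)
    quad = y @ np.linalg.solve(cov, y)
    log_marg = -0.5 * (n * np.log(2 * np.pi) + logdet + quad)
    # tr(Knn) = n for the unit-variance RBF kernel.
    trace_term = (n - np.trace(Q)) / (2 * noise_var)
    return log_marg - trace_term

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 100))
y = np.sin(6 * x) + 0.1 * rng.standard_normal(100)
z = np.linspace(0.0, 1.0, 10)              # inducing inputs

# Empirical-Bayes-style tuning: pick the lengthscale maximising the bound.
grid = [0.01, 0.05, 0.1, 0.3, 1.0]
best = max(grid, key=lambda ls: titsias_elbo(x, y, z, ls, noise_var=0.01))
print(best)
```

The paper's question is, roughly, whether such data-driven hyperparameter choices combined with a variational class still yield a posterior that contracts at the minimax optimal rate.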
Similar Papers
STRIDE: Sparse Techniques for Regression in Deep Gaussian Processes
Machine Learning (Stat)
Develops sparse techniques for scalable regression with deep Gaussian processes.
Variational inference for hierarchical models with conditional scale and skewness corrections
Methodology
Improves variational inference for hierarchical models via conditional scale and skewness corrections.
Variational Inference with Mixtures of Isotropic Gaussians
Machine Learning (Stat)
Approximates complex posteriors using mixtures of isotropic Gaussians.