Unbiased Stochastic Optimization for Gaussian Processes on Finite Dimensional RKHS
By: Neta Shoham, Haim Avron
Potential Business Impact:
Makes machine learning faster and more accurate when computer memory is limited.
Current methods for stochastic hyperparameter learning in Gaussian Processes (GPs) rely on approximations, such as computing biased stochastic gradients or using inducing points in stochastic variational inference. However, such methods are not guaranteed to converge to a stationary point of the true marginal likelihood. In this work, we propose algorithms for exact stochastic inference of GPs with kernels that induce a Reproducing Kernel Hilbert Space (RKHS) of moderate finite dimension. Our approach can also be extended to infinite dimensional RKHSs at the cost of forgoing exactness. For both finite and infinite dimensional RKHSs, our method achieves better experimental results than existing methods when memory constraints limit the feasible batch size and the number of inducing points.
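As a rough illustration of the finite-dimensional setting the abstract describes (a sketch of the standard degenerate-kernel computation, not the authors' stochastic algorithm), the Python snippet below evaluates the exact GP log marginal likelihood for a kernel k(x, x') = sigma_f^2 * phi(x)^T phi(x') with an m-dimensional feature map. The function name and the random cosine feature map in the usage example are illustrative assumptions; the hyperparameter learning referred to above would optimize this objective over (sigma_f, sigma_n), with the paper's contribution being unbiased stochastic gradients of it.

```python
import numpy as np

def log_marginal_likelihood(Phi, y, sigma_f, sigma_n):
    """Exact GP log marginal likelihood for a degenerate kernel
    k(x, x') = sigma_f**2 * phi(x) @ phi(x'), where Phi (n x m) stacks
    the feature vectors phi(x_i) row-wise.  The Woodbury identity and the
    matrix determinant lemma reduce the cost from O(n**3) to O(n * m**2)."""
    n, m = Phi.shape
    # m x m "capacitance" matrix A = sigma_f^2 Phi^T Phi + sigma_n^2 I
    A = sigma_f**2 * (Phi.T @ Phi) + sigma_n**2 * np.eye(m)
    L = np.linalg.cholesky(A)
    v = np.linalg.solve(L, Phi.T @ y)
    # y^T K^{-1} y with K = sigma_f^2 Phi Phi^T + sigma_n^2 I (Woodbury)
    quad = (y @ y - sigma_f**2 * (v @ v)) / sigma_n**2
    # log det K via the matrix determinant lemma
    logdet = 2.0 * np.sum(np.log(np.diag(L))) + (n - m) * np.log(sigma_n**2)
    return -0.5 * (quad + logdet + n * np.log(2.0 * np.pi))

# Toy usage with a random cosine feature map (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
W = rng.normal(size=(1, 16))
Phi = np.cos(X @ W)                                  # m = 16 features
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
print(log_marginal_likelihood(Phi, y, sigma_f=1.0, sigma_n=0.1))
```

Because the cost is linear in n for fixed feature dimension m, the full-batch objective stays tractable for moderate m; the stochastic-gradient question the paper addresses arises when even this is too expensive per step.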
Similar Papers
A Kernel-based Stochastic Approximation Framework for Nonlinear Operator Learning
Machine Learning (Stat)
Teaches computers to solve hard math problems.
Kernel-Based Nonparametric Tests For Shape Constraints
Machine Learning (Stat)
Helps make better money choices with math.