Lengthscale-informed sparse grids for kernel methods in high dimensions
By: Elliot J. Addy, Jonas Latz, Aretha L. Teckentrup
Potential Business Impact:
Makes cheap surrogate models of expensive computer simulations accurate even when there are many input variables.
Kernel interpolation, especially in the context of Gaussian process emulation, is a widely used technique in surrogate modelling, where the goal is to cheaply approximate an input-output map using a limited number of function evaluations. However, in high-dimensional settings, such methods typically suffer from the curse of dimensionality: the number of evaluations required to achieve a fixed approximation error grows exponentially with the input dimension. To overcome this, a common technique in high-dimensional approximation methods, such as quasi-Monte Carlo and sparse grids, is to exploit functional anisotropy: the idea that some input dimensions are more 'sensitive' than others. By doing so, such methods can significantly reduce the dimension dependence of the error. In this work, we propose a generalisation of sparse grid methods that incorporates a form of anisotropy encoded by the lengthscale parameter in Matérn kernels. We derive error bounds and perform numerical experiments showing that our approach enables effective emulation in arbitrarily high dimensions for functions exhibiting sufficient anisotropy.
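The idea of exploiting anisotropy through lengthscales can be illustrated with plain kernel interpolation. The sketch below is not the paper's sparse-grid construction; it is a minimal example, under assumed settings, in which per-dimension Matérn-5/2 lengthscales (chosen by hand for a purely hypothetical test function) encode which inputs the target actually depends on.

```python
# Minimal sketch (NOT the paper's sparse-grid method): kernel interpolation with
# an anisotropic Matern-5/2 kernel, where per-dimension lengthscales encode
# which inputs the target function is sensitive to. All settings are illustrative.
import numpy as np

def matern52_ard(X1, X2, lengthscales):
    """Matern-5/2 kernel with one lengthscale per input dimension."""
    diff = (X1[:, None, :] - X2[None, :, :]) / lengthscales  # scaled differences
    r = np.sqrt(np.sum(diff**2, axis=-1))                    # anisotropic distance
    return (1.0 + np.sqrt(5.0) * r + 5.0 * r**2 / 3.0) * np.exp(-np.sqrt(5.0) * r)

rng = np.random.default_rng(0)
d = 10
# Hypothetical anisotropic test function: only the first two inputs matter much.
f = lambda X: np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.01 * X[:, 2:].sum(axis=1)

# Lengthscales reflect the anisotropy: small for sensitive dimensions, large otherwise.
ell = np.array([0.3, 0.3] + [5.0] * (d - 2))

X_train = rng.uniform(size=(200, d))
y_train = f(X_train)

# Kernel interpolation: solve K alpha = y (a small jitter keeps K well conditioned).
K = matern52_ard(X_train, X_train, ell) + 1e-10 * np.eye(len(X_train))
alpha = np.linalg.solve(K, y_train)

X_test = rng.uniform(size=(1000, d))
y_pred = matern52_ard(X_test, X_train, ell) @ alpha
print("RMSE:", np.sqrt(np.mean((y_pred - f(X_test)) ** 2)))
```

Directions with large lengthscales contribute little to the anisotropic distance, so the interpolant concentrates its resolution on the sensitive inputs; the paper's contribution is to build this lengthscale-informed weighting into the sparse-grid construction itself and to derive the corresponding error bounds.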
Similar Papers
A Surrogate-Informed Framework for Sparse Grid Interpolation
Computational Engineering, Finance, and Science
Makes computer models work faster and more accurately.
Kernel Interpolation on Sparse Grids
Numerical Analysis
Makes computer predictions faster for huge amounts of data.
Efficiently parallelizable kernel-based multi-scale algorithm
Numerical Analysis
Makes computer calculations much faster.