The Kernel Manifold: A Geometric Approach to Gaussian Process Model Selection
By: Md Shafiqul Islam, Shakti Prasad Padhy, Douglas Allaire, and others
Potential Business Impact:
Automatically picks the best settings for prediction models, giving more accurate forecasts and more trustworthy uncertainty estimates.
Gaussian Process (GP) regression is a powerful nonparametric Bayesian framework, but its performance depends critically on the choice of covariance kernel. Selecting an appropriate kernel is therefore central to model quality, yet it remains one of the most challenging and computationally expensive steps in probabilistic modeling. We present a Bayesian optimization (BO) framework built on a kernel-of-kernels geometry, using expected divergence-based distances between GP priors to explore kernel space efficiently. A multidimensional scaling (MDS) embedding of this distance matrix maps a discrete kernel library into a continuous Euclidean manifold, enabling smooth BO over kernel structures. In this formulation, the input space comprises kernel compositions, the objective is the log marginal likelihood, and featurization is given by the MDS coordinates. When the divergence yields a valid metric, the embedding preserves geometry and produces a stable BO landscape. We demonstrate the approach on synthetic benchmarks, real-world time-series datasets, and an additive manufacturing case study predicting melt-pool geometry, where it achieves superior predictive accuracy and uncertainty calibration relative to baselines, including Large Language Model (LLM)-guided search. This framework establishes a reusable probabilistic geometry for kernel search, with direct relevance to GP modeling and deep kernel learning.
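To make the pipeline concrete, here is a minimal sketch of the three stages the abstract describes: a divergence-based distance matrix over a kernel library, an MDS embedding of that matrix, and BO over the embedded coordinates with log marginal likelihood as the objective. The specific divergence used below (symmetrized KL between zero-mean GP priors evaluated on a shared grid), the kernel library, and the expected-improvement acquisition are illustrative assumptions, not the authors' exact choices.

```python
# Sketch of kernel selection via a kernel-of-kernels geometry.
# Assumptions: symmetrized KL between zero-mean priors on a shared grid
# as the divergence; a small hand-picked kernel library; EI acquisition.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import (
    RBF, Matern, RationalQuadratic, ExpSineSquared,
)
from sklearn.manifold import MDS

rng = np.random.default_rng(0)

# 1. A discrete kernel library, including simple compositions.
library = [
    RBF(length_scale=1.0),
    Matern(length_scale=1.0, nu=1.5),
    RationalQuadratic(length_scale=1.0, alpha=1.0),
    ExpSineSquared(length_scale=1.0, periodicity=1.0),
    RBF(1.0) + ExpSineSquared(1.0, 1.0),
    RBF(1.0) * ExpSineSquared(1.0, 1.0),
    Matern(1.0, nu=2.5) + RationalQuadratic(1.0, 1.0),
]

# Shared grid on which each kernel's prior covariance is compared.
grid = np.linspace(0.0, 5.0, 40).reshape(-1, 1)
jitter = 1e-6 * np.eye(len(grid))
covs = [k(grid) + jitter for k in library]

def kl_gauss(A, B):
    """KL( N(0, A) || N(0, B) ) for covariance matrices A, B."""
    n = A.shape[0]
    _, logdet_A = np.linalg.slogdet(A)
    _, logdet_B = np.linalg.slogdet(B)
    return 0.5 * (np.trace(np.linalg.solve(B, A)) - n + logdet_B - logdet_A)

# 2. Pairwise symmetrized-KL distance matrix between GP priors.
m = len(library)
D = np.zeros((m, m))
for i in range(m):
    for j in range(i + 1, m):
        D[i, j] = D[j, i] = 0.5 * (
            kl_gauss(covs[i], covs[j]) + kl_gauss(covs[j], covs[i])
        )

# 3. MDS embedding: the discrete library becomes points in a
#    continuous Euclidean space, the "kernel manifold".
coords = MDS(
    n_components=2, dissimilarity="precomputed", random_state=0
).fit_transform(D)

# Toy observations to select a kernel for (replace with real data).
X = np.sort(rng.uniform(0, 5, 30)).reshape(-1, 1)
y = np.sin(2.0 * X).ravel() + 0.1 * rng.standard_normal(30)

def objective(kernel):
    """BO objective: log marginal likelihood of the fitted GP."""
    gpr = GaussianProcessRegressor(
        kernel=kernel, alpha=1e-4, normalize_y=True
    ).fit(X, y)
    return gpr.log_marginal_likelihood_value_

# 4. BO over the embedding: a GP surrogate on MDS coordinates,
#    expected improvement picks the next kernel to evaluate.
evaluated = {0: objective(library[0]), 1: objective(library[1])}
for _ in range(3):
    idx = sorted(evaluated)
    surrogate = GaussianProcessRegressor(kernel=RBF(1.0), normalize_y=True)
    surrogate.fit(coords[idx], [evaluated[i] for i in idx])
    best = max(evaluated.values())
    mu, sd = surrogate.predict(coords, return_std=True)
    z = (mu - best) / np.maximum(sd, 1e-9)
    ei = (mu - best) * norm.cdf(z) + sd * norm.pdf(z)
    ei[idx] = -np.inf  # do not re-evaluate library members
    nxt = int(np.argmax(ei))
    evaluated[nxt] = objective(library[nxt])

winner = max(evaluated, key=evaluated.get)
print(f"selected kernel: {library[winner]}  (LML = {evaluated[winner]:.2f})")
```

Because the surrogate operates on the MDS coordinates rather than on discrete kernel indices, nearby kernels in the divergence geometry receive correlated objective predictions, which is what lets BO generalize across the library instead of exhaustively evaluating every candidate.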
Similar Papers
Deep Gaussian Processes with Gradients
Methodology
Helps computers learn from data with tricky shapes.
Optimal Kernel Learning for Gaussian Process Models with High-Dimensional Input
Machine Learning (CS)
Finds important settings to make computer models work better.
Concentration bounds on response-based vector embeddings of black-box generative models
Machine Learning (Stat)
Helps explain how AI systems arrive at their answers.