Adaptive Kernel Design for Bayesian Optimization Is a Piece of CAKE with LLMs
By: Richard Cornelius Suwandi, Feng Yin, Juntao Wang, et al.
Potential Business Impact:
Helps computers find best settings faster.
The efficiency of Bayesian optimization (BO) relies heavily on the choice of the Gaussian process (GP) kernel, which plays a central role in balancing exploration and exploitation under limited evaluation budgets. Traditional BO methods often rely on fixed or heuristic kernel selection strategies, which can lead to slow convergence or suboptimal solutions when the chosen kernel is poorly suited to the underlying objective function. To address this limitation, we propose a freshly baked method, Context-Aware Kernel Evolution (CAKE), that enhances BO with large language models (LLMs). Concretely, CAKE uses LLMs as crossover and mutation operators to adaptively generate and refine GP kernels based on the data observed throughout the optimization process. To get the most out of CAKE, we further propose BIC-Acquisition Kernel Ranking (BAKER), which selects the most effective kernel at each BO iteration by balancing model fit, measured by the Bayesian information criterion (BIC), against the expected improvement. Extensive experiments demonstrate that our CAKE-based BO method consistently outperforms established baselines across a range of real-world tasks, including hyperparameter optimization, controller tuning, and photonic chip design. Our code is publicly available at https://github.com/richardcsuwandi/cake.
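To make the BAKER idea concrete, here is a minimal NumPy-only sketch of ranking candidate GP kernels by BIC together with a closed-form expected improvement (EI) at one candidate point. Everything here is assumed for illustration: the toy 1-D data, the fixed kernel hyperparameters and noise level, the two candidate kernels, and especially the final combination rule (a simple weighted score), which is a stand-in and not the paper's actual BAKER criterion.

```python
import math
import numpy as np

# Toy 1-D observations (assumed data, not from the paper's benchmarks)
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(12, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(12)

NOISE = 1e-2  # fixed observation-noise variance (assumption)

def rbf(a, b, ls=1.0):
    d = a[:, None, 0] - b[None, :, 0]
    return np.exp(-0.5 * (d / ls) ** 2)

def periodic(a, b, ls=1.0, p=2.0):
    d = np.abs(a[:, None, 0] - b[None, :, 0])
    return np.exp(-2.0 * np.sin(np.pi * d / p) ** 2 / ls ** 2)

def log_marginal_likelihood(kern, X, y):
    # Standard GP log marginal likelihood via a Cholesky factorization
    n = len(y)
    K = kern(X, X) + NOISE * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.log(np.diag(L)).sum()
            - 0.5 * n * np.log(2.0 * np.pi))

def bic(kern, n_params, X, y):
    # Lower BIC = better fit/complexity trade-off
    return n_params * np.log(len(y)) - 2.0 * log_marginal_likelihood(kern, X, y)

def expected_improvement(kern, X, y, x_star):
    # Closed-form EI at one point under the GP posterior (minimization convention)
    n = len(y)
    K = kern(X, X) + NOISE * np.eye(n)
    k_s = kern(X, x_star)  # shape (n, 1)
    mu = float(k_s.T @ np.linalg.solve(K, y))
    var = float(kern(x_star, x_star)[0, 0] + NOISE - k_s.T @ np.linalg.solve(K, k_s))
    sigma = math.sqrt(max(var, 1e-12))
    imp = y.min() - mu
    z = imp / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    return imp * cdf + sigma * pdf

candidates = {"rbf": (rbf, 1), "periodic": (periodic, 2)}
x_star = np.array([[0.5]])  # arbitrary candidate point (assumption)
results = {name: (bic(kern, n_params, X, y),
                  expected_improvement(kern, X, y, x_star))
           for name, (kern, n_params) in candidates.items()}

# Hypothetical combination rule, NOT the paper's exact BAKER criterion:
# lowest BIC minus a small EI bonus.
best = min(results, key=lambda k: results[k][0] - 10.0 * results[k][1])
print(best, results)
```

In a full CAKE loop, the candidate set would instead be kernel expressions proposed by LLM-driven crossover and mutation, re-ranked this way at every BO iteration.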
Similar Papers
Omnipresent Yet Overlooked: Heat Kernels in Combinatorial Bayesian Optimization
Machine Learning (CS)
Makes computers smarter at finding best designs.
BOOST: Bayesian Optimization with Optimal Kernel and Acquisition Function Selection Technique
Machine Learning (CS)
Finds best settings for computer learning faster.
On the Implementation of a Bayesian Optimization Framework for Interconnected Systems
Machine Learning (Stat)
Finds best answers faster by using known parts.