Omnipresent Yet Overlooked: Heat Kernels in Combinatorial Bayesian Optimization
By: Colin Doumont, Victor Picheny, Viacheslav Borovitskiy, and more
Potential Business Impact:
Makes computers smarter at finding the best designs.
Bayesian Optimization (BO) has the potential to solve various combinatorial tasks, ranging from materials science to neural architecture search. However, BO requires specialized kernels to effectively model combinatorial domains. Recent efforts have introduced several combinatorial kernels, but the relationships among them are not well understood. To bridge this gap, we develop a unifying framework based on heat kernels, which we derive in a systematic way and express as simple closed-form expressions. Using this framework, we prove that many successful combinatorial kernels are either related or equivalent to heat kernels, and validate this theoretical claim in our experiments. Moreover, our analysis confirms and extends the results presented in Bounce: the performance of certain algorithms degrades substantially when the unknown optima of the function lack a particular structure. In contrast, heat kernels are not sensitive to the location of the optima. Lastly, we show that a fast and simple pipeline, relying on heat kernels, is able to achieve state-of-the-art results, matching or even outperforming certain slow or complex algorithms.
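To make the "simple closed-form expressions" concrete, here is a minimal sketch of a heat (diffusion) kernel on one common combinatorial domain, the binary hypercube {0,1}^d, where the heat kernel exp(-tL) of the hypercube graph factorizes over coordinates and reduces to tanh(t) raised to the Hamming distance after normalization. The function name, the parameter t, and the restriction to binary inputs are illustrative assumptions; the paper's framework covers other combinatorial domains and may use a different parameterization.

```python
import numpy as np

def hypercube_heat_kernel(X, Y, t=0.5):
    """Normalized heat (diffusion) kernel on the binary hypercube {0,1}^d.

    Illustrative sketch: for the hypercube graph, exp(-t L) factorizes over
    coordinates, and normalizing the diagonal to 1 gives the closed form
    tanh(t) ** hamming(x, y). X: (n, d), Y: (m, d) binary arrays; returns
    an (n, m) Gram matrix.
    """
    X = np.asarray(X, dtype=int)
    Y = np.asarray(Y, dtype=int)
    # Pairwise Hamming distances between rows of X and rows of Y.
    hamming = (X[:, None, :] != Y[None, :, :]).sum(axis=-1)
    return np.tanh(t) ** hamming

# Example: Gram matrix for a few 4-dimensional binary points.
X = np.array([[0, 0, 0, 0],
              [1, 0, 1, 0],
              [1, 1, 1, 1]])
K = hypercube_heat_kernel(X, X, t=0.5)
print(K)  # symmetric, ones on the diagonal, positive semi-definite
```

Because the kernel depends only on the Hamming distance between inputs, it treats all regions of the hypercube symmetrically, which is consistent with the abstract's point that heat kernels are not sensitive to where the optima happen to lie.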
Similar Papers
Adaptive Kernel Design for Bayesian Optimization Is a Piece of CAKE with LLMs
Machine Learning (CS)
Helps computers find the best settings faster.
We Still Don't Understand High-Dimensional Bayesian Optimization
Machine Learning (CS)
Finds the best solutions in huge, complex problems.
BOOST: Bayesian Optimization with Optimal Kernel and Acquisition Function Selection Technique
Machine Learning (CS)
Finds the best settings for machine learning faster.