Bayesian Optimisation: Which Constraints Matter?
By: Xietao Wang Lin, Juan Ungredda, Max Butler, and more
Bayesian optimisation has proven to be a powerful tool for expensive global black-box optimisation problems. In this paper, we propose new Bayesian optimisation variants of the popular Knowledge Gradient acquisition function for problems with decoupled black-box constraints, in which subsets of the objective and constraint functions may be evaluated independently. In particular, our methods take into account that often only a handful of the constraints are binding at the optimum, and hence evaluation effort should be spent only on the constraints that are relevant to the optimisation. We empirically benchmark these methods against existing approaches and demonstrate their superiority over the state-of-the-art.
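To make the setting concrete, here is a minimal Python sketch of decoupled constrained Bayesian optimisation, in which the objective and the constraint are modelled by separate Gaussian processes and can be queried independently. This is not the paper's Knowledge Gradient method: the toy objective f, the toy constraint c, the constrained expected-improvement acquisition, and the feasibility-uncertainty threshold used to decide which black box to evaluate are all illustrative assumptions.

```python
# Minimal sketch (NOT the paper's method) of decoupled constrained Bayesian
# optimisation: objective f and constraint c have separate GP models and
# separate data sets, so each evaluation targets only one of them.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

f = lambda x: np.sin(3 * x) + x ** 2   # toy objective (to be minimised)
c = lambda x: 1.0 - x ** 2             # toy constraint, feasible where c(x) >= 0

# Decoupling: f and c need not share query points.
X_f, X_c = rng.uniform(-2, 2, (3, 1)), rng.uniform(-2, 2, (3, 1))
y_f, y_c = f(X_f).ravel(), c(X_c).ravel()

def fit_gp(X, y):
    return GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)

X_grid = np.linspace(-2, 2, 401).reshape(-1, 1)

for _ in range(15):
    gp_f, gp_c = fit_gp(X_f, y_f), fit_gp(X_c, y_c)
    mu_f, sd_f = gp_f.predict(X_grid, return_std=True)
    mu_c, sd_c = gp_c.predict(X_grid, return_std=True)

    pof = norm.cdf(mu_c / np.maximum(sd_c, 1e-9))               # P(c(x) >= 0)
    incumbent = mu_f[pof > 0.5].min() if (pof > 0.5).any() else mu_f.min()

    # Constrained expected improvement: EI on the objective times P(feasible).
    z = (incumbent - mu_f) / np.maximum(sd_f, 1e-9)
    acq = ((incumbent - mu_f) * norm.cdf(z) + sd_f * norm.pdf(z)) * pof
    i = int(np.argmax(acq))
    x_next = X_grid[i]

    # Decoupled source selection (illustrative heuristic): query the constraint
    # only where feasibility is still genuinely uncertain, i.e. where the
    # constraint may be binding; otherwise spend the evaluation on the objective.
    if 0.05 < pof[i] < 0.95:
        X_c, y_c = np.vstack([X_c, x_next]), np.append(y_c, c(x_next))
    else:
        X_f, y_f = np.vstack([X_f, x_next]), np.append(y_f, f(x_next))

# Recommend the point with the best posterior mean among points the
# constraint model currently believes are feasible.
feasible = pof >= 0.5
if not feasible.any():
    feasible[:] = True
print("recommended x:", X_grid[feasible][np.argmin(mu_f[feasible])].item())
```

The point of the sketch is the final if/else: because the objective and the constraint are evaluated independently, the algorithm can stop querying a constraint once it is confidently non-binding near the candidate optimum and reallocate that budget to the objective, which is the behaviour the paper's decoupled acquisition functions formalise.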
Similar Papers
Function-on-Function Bayesian Optimization
Machine Learning (Stat)
Finds best settings for complex computer programs.
Global Optimization on Graph-Structured Data via Gaussian Processes with Spectral Representations
Machine Learning (CS)
Finds best patterns in connected data faster.
We Still Don't Understand High-Dimensional Bayesian Optimization
Machine Learning (CS)
Finds best solutions in huge, complex problems.