OPBO: Order-Preserving Bayesian Optimization
By: Wei Peng, Jianchen Hu, Kang Liu and more
Bayesian optimization is an effective method for solving expensive black-box optimization problems. Most existing methods use a Gaussian process (GP) as the surrogate model for approximating the black-box objective function, but it is well known that GPs can fail in high-dimensional spaces (e.g., dimension over 500). We argue that the reliance of GPs on precise numerical fitting is fundamentally ill-suited to high-dimensional spaces, where it leads to prohibitive computational complexity. To address this, we propose a simple order-preserving Bayesian optimization (OPBO) method, whose surrogate model preserves the order, rather than the value, of the black-box objective function. This allows a simple but effective order-preserving (OP) neural network (NN) to replace the GP as the surrogate model. Moreover, instead of searching for the best solution under the acquisition model, we select good-enough solutions from the ordinal set to reduce computational cost. Experimental results show that for high-dimensional (over 500) black-box optimization problems, the proposed OPBO significantly outperforms traditional BO methods based on regression NNs and GPs. The source code is available at https://github.com/pengwei222/OPBO.
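To make the "order-preserving surrogate" idea concrete, here is a minimal sketch of one common way to train a model to preserve order rather than value: a pairwise hinge ranking loss that is zero exactly when the surrogate's predictions rank all candidate points in the same order as the true objective (with a margin). This is an illustrative assumption about the training objective, not necessarily the exact loss used in the paper; the function name and margin parameter are hypothetical.

```python
def pairwise_ranking_loss(pred, true, margin=0.1):
    """Penalize surrogate predictions whose ordering disagrees with
    the true objective values.

    For every pair (i, j) where true[i] < true[j] (i is the better
    point under minimization), the surrogate should also score i
    lower than j by at least `margin`; violations add a hinge penalty.
    The loss is zero iff the surrogate preserves the order with margin.
    """
    loss = 0.0
    n = len(true)
    for i in range(n):
        for j in range(n):
            if true[i] < true[j]:
                # hinge: penalize when pred[j] - pred[i] < margin
                loss += max(0.0, margin - (pred[j] - pred[i]))
    return loss
```

Note that the loss only depends on pairwise comparisons of predictions, never on how close the predicted values are to the true objective values, which is the key relaxation OPBO exploits in high dimensions.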
Similar Papers
Scalable Neural Network-based Blackbox Optimization
Machine Learning (CS)
Finds best answers faster, even with many choices.
Towards Scalable Bayesian Optimization via Gradient-Informed Bayesian Neural Networks
Machine Learning (CS)
Makes computer learning faster by using more math.
On the Implementation of a Bayesian Optimization Framework for Interconnected Systems
Machine Learning (Stat)
Finds best answers faster by using known parts.