Simulated Annealing-based Candidate Optimization for Batch Acquisition Functions
By: Sk Md Ahnaf Akif Alvi, Raymundo Arróyave, Douglas Allaire
Potential Business Impact:
Finds better answers for complex problems faster.
Bayesian Optimization with multi-objective acquisition functions such as q-Expected Hypervolume Improvement (qEHVI) requires efficient candidate optimization to maximize acquisition function values. Traditional approaches rely on continuous optimization methods such as Sequential Least Squares Programming (SLSQP) for candidate selection. However, these gradient-based methods can become trapped in local optima, particularly in complex or high-dimensional objective landscapes. This paper presents a simulated annealing-based approach to candidate optimization in batch acquisition functions as an alternative to conventional continuous optimization methods. We evaluate our simulated annealing approach against SLSQP on four benchmark multi-objective optimization problems: ZDT1 (30D, 2 objectives), DTLZ2 (7D, 3 objectives), Kursawe (3D, 2 objectives), and Latent-Aware (4D, 2 objectives). Our results show that simulated annealing achieves superior hypervolume performance compared to SLSQP on most of the test functions. The improvement is particularly pronounced for the DTLZ2 and Latent-Aware problems, where simulated annealing reaches significantly higher hypervolume values and exhibits better convergence behavior. Histogram analysis of objective-space coverage further reveals that simulated annealing explores more diverse and optimal regions of the Pareto front. These findings suggest that metaheuristic optimization approaches like simulated annealing can provide more robust and effective candidate optimization for multi-objective Bayesian optimization, offering a promising alternative to traditional gradient-based methods for batch acquisition function optimization.
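To make the candidate-optimization step concrete, below is a minimal NumPy sketch of simulated annealing over a q-by-d candidate batch. The function name `anneal_candidates`, the geometric cooling schedule, the `step_frac` perturbation scale, the toy `toy_acqf` objective, and the 4D box bounds are all illustrative assumptions standing in for the paper's actual setup; in practice `acqf` would be a fitted batch acquisition function such as qEHVI.

```python
import numpy as np


def anneal_candidates(acqf, bounds, q, n_steps=2000, t0=1.0, t_final=1e-3,
                      step_frac=0.1, seed=0):
    """Maximize a batch acquisition function over a (q, d) candidate set
    with a simple simulated-annealing loop (geometric cooling schedule)."""
    rng = np.random.default_rng(seed)
    lower, upper = bounds              # each of shape (d,)
    span = upper - lower

    # Start from a random candidate batch inside the box constraints.
    x = rng.uniform(lower, upper, size=(q, lower.shape[0]))
    f = acqf(x)
    best_x, best_f = x.copy(), f

    cooling = (t_final / t0) ** (1.0 / n_steps)
    temp = t0
    for _ in range(n_steps):
        # Perturb the whole batch with Gaussian noise, clipped to the bounds.
        cand = np.clip(x + rng.normal(scale=step_frac * span, size=x.shape),
                       lower, upper)
        f_cand = acqf(cand)
        # Metropolis rule: always accept improvements, occasionally accept worse moves.
        if f_cand >= f or rng.random() < np.exp((f_cand - f) / temp):
            x, f = cand, f_cand
            if f > best_f:
                best_x, best_f = x.copy(), f
        temp *= cooling
    return best_x, best_f


def toy_acqf(X):
    # Toy stand-in for a fitted acquisition function (qEHVI in the paper's setting):
    # a smooth batch score that peaks when every candidate sits at 0.5.
    return float(-np.sum((X - 0.5) ** 2))


bounds = (np.zeros(4), np.ones(4))     # hypothetical 4D box, e.g. a Latent-Aware-sized space
batch, value = anneal_candidates(toy_acqf, bounds, q=3)
print(batch.shape, value)
```

The temperature schedule and perturbation scale here are arbitrary defaults; the key contrast with SLSQP is that the Metropolis acceptance step allows occasional downhill moves, which is what lets the search escape local optima of the acquisition surface.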
Similar Papers
Batch Acquisition Function Evaluations and Decouple Optimizer Updates for Faster Bayesian Optimization
Machine Learning (CS)
Makes computer searches for best settings much faster.
Generative Multi-Objective Bayesian Optimization with Scalable Batch Evaluations for Sample-Efficient De Novo Molecular Design
Machine Learning (Stat)
Finds new battery materials faster.