Dynamic Hyperparameter Importance for Efficient Multi-Objective Optimization
By: Daphne Theodorakopoulos, Marcel Wever, Marius Lindauer
Potential Business Impact:
Finds the best model settings faster when balancing several goals at once.
Choosing a suitable ML model is a complex task that can depend on several objectives, e.g., accuracy, model size, fairness, inference time, or energy consumption. In practice, this requires trading off multiple, often competing, objectives through multi-objective optimization (MOO). However, existing MOO methods typically treat all hyperparameters as equally important, overlooking that hyperparameter importance (HPI) can vary significantly depending on the trade-off between objectives. We propose a novel dynamic optimization approach that prioritizes the most influential hyperparameters based on varying objective trade-offs during the search process, which accelerates empirical convergence and leads to better solutions. Building on prior work on HPI for MOO post-analysis, we now integrate HPI, calculated with HyperSHAP, into the optimization. For this, we leverage the objective weightings naturally produced by the MOO algorithm ParEGO and adapt the configuration space by fixing the unimportant hyperparameters, allowing the search to focus on the important ones. Finally, we validate our method with diverse tasks from PyMOO and YAHPO-Gym. Empirical results demonstrate improvements in convergence speed and Pareto front quality compared to baselines.
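The core loop described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the ParEGO-style weight sampling is standard, but the importance scorer below is a mocked stand-in for HyperSHAP, and all hyperparameter names, domains, and the top-k cutoff are invented for the example.

```python
import random

def parego_weights(n_objectives, rng):
    """Sample a random objective weighting summing to 1 (ParEGO-style scalarization)."""
    raw = [rng.random() for _ in range(n_objectives)]
    total = sum(raw)
    return [w / total for w in raw]

def hpi_scores(weights, hyperparameters):
    """Mocked stand-in for HyperSHAP: returns an importance score per
    hyperparameter that depends on the current objective weighting."""
    rng = random.Random(round(weights[0] * 1e6))  # scores vary with the weighting
    return {hp: rng.random() for hp in hyperparameters}

def restrict_space(space, scores, keep_top_k):
    """Fix all but the top-k most important hyperparameters to their defaults,
    so the search only varies the important ones."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    free = set(ranked[:keep_top_k])
    return {hp: (domain if hp in free else [default])
            for hp, (domain, default) in space.items()}

# Hypothetical configuration space: hp -> (candidate values, default).
space = {
    "learning_rate": ([1e-4, 1e-3, 1e-2], 1e-3),
    "num_layers":    ([1, 2, 3, 4], 2),
    "dropout":       ([0.0, 0.1, 0.3], 0.1),
    "batch_size":    ([32, 64, 128], 64),
}

rng = random.Random(0)
w = parego_weights(2, rng)              # new trade-off for this iteration
scores = hpi_scores(w, space)           # importance under this trade-off
restricted = restrict_space(space, scores, keep_top_k=2)
```

In the real method, this restriction would be recomputed whenever ParEGO draws a new objective weighting, so which hyperparameters stay free changes dynamically over the run.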
Similar Papers
Grouped Sequential Optimization Strategy -- the Application of Hyperparameter Importance Assessment in Deep Learning
Machine Learning (CS)
Speeds up deep learning by assessing which settings matter most.
From Black-Box Tuning to Guided Optimization via Hyperparameters Interaction Analysis
Machine Learning (CS)
Guides tuning by analyzing how settings interact with each other.
Parametric Expensive Multi-Objective Optimization via Generative Solution Modeling
Machine Learning (CS)
Solves many hard problems faster, without re-testing.