Dynamic Hyperparameter Importance for Efficient Multi-Objective Optimization

Published: January 6, 2026 | arXiv ID: 2601.03166v1

By: Daphne Theodorakopoulos, Marcel Wever, Marius Lindauer

Potential Business Impact:

Finds the best machine learning settings faster across different tasks, cutting tuning time.

Business Areas:
Personalization, Commerce and Shopping

Choosing a suitable ML model is a complex task that can depend on several objectives, e.g., accuracy, model size, fairness, inference time, or energy consumption. In practice, this requires trading off multiple, often competing, objectives through multi-objective optimization (MOO). However, existing MOO methods typically treat all hyperparameters as equally important, overlooking that hyperparameter importance (HPI) can vary significantly depending on the trade-off between objectives. We propose a novel dynamic optimization approach that prioritizes the most influential hyperparameters based on the varying objective trade-offs during the search process, which accelerates empirical convergence and leads to better solutions. Building on prior work that used HPI only for post-hoc analysis of MOO, we integrate HPI, computed with HyperSHAP, directly into the optimization loop. To do so, we leverage the objective weightings naturally produced by the MOO algorithm ParEGO and adapt the configuration space by fixing the unimportant hyperparameters, allowing the search to focus on the important ones. Finally, we validate our method on diverse tasks from PyMOO and YAHPO-Gym. Empirical results demonstrate improvements in convergence speed and Pareto-front quality compared to baselines.
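
To make the mechanics concrete, below is a minimal sketch of the dynamic-HPI loop the abstract describes. It is not the authors' implementation: the toy objectives stand in for the PyMOO/YAHPO-Gym benchmarks, plain random search replaces ParEGO's Bayesian optimization, and `estimate_hpi` is a hypothetical placeholder for a HyperSHAP-based importance estimate. Only the ParEGO-style random weight sampling, the augmented Tchebycheff scalarization, and the fixing of unimportant hyperparameters follow the paper's description.

```python
"""Sketch of dynamic hyperparameter importance for MOO (illustrative only)."""
import random

# Toy continuous configuration space: name -> (low, high), plus defaults
# used for hyperparameters that are "fixed" as unimportant.
SPACE = {"lr": (1e-4, 1e-1), "depth": (1.0, 10.0), "dropout": (0.0, 0.5)}
DEFAULTS = {"lr": 1e-2, "depth": 5.0, "dropout": 0.1}

def objectives(cfg):
    """Two toy competing objectives (e.g., error vs. model size)."""
    err = (cfg["lr"] - 0.03) ** 2 + 0.01 * cfg["dropout"]
    size = cfg["depth"] / 10.0
    return err, size

def scalarize(objs, w, rho=0.05):
    """ParEGO's augmented Tchebycheff scalarization under weights w."""
    terms = [wi * oi for wi, oi in zip(w, objs)]
    return max(terms) + rho * sum(terms)

def estimate_hpi(w):
    """HYPOTHETICAL stand-in for a HyperSHAP importance estimate:
    returns per-hyperparameter importances for the current weighting w.
    Here the importances simply shift with the trade-off, for illustration."""
    return {"lr": w[0], "depth": w[1], "dropout": 0.1}

def sample_cfg(active):
    """Sample only the active (important) hyperparameters; fix the rest."""
    cfg = dict(DEFAULTS)
    for name in active:
        lo, hi = SPACE[name]
        cfg[name] = random.uniform(lo, hi)
    return cfg

def dynamic_search(n_iters=50, top_k=2, seed=0):
    random.seed(seed)
    archive = []
    for _ in range(n_iters):
        w0 = random.random()                 # ParEGO-style random weight draw
        w = (w0, 1.0 - w0)
        hpi = estimate_hpi(w)                # importance for this trade-off
        active = sorted(hpi, key=hpi.get, reverse=True)[:top_k]
        cfg = sample_cfg(active)             # search only the important HPs
        archive.append((scalarize(objectives(cfg), w), cfg))
    return min(archive, key=lambda t: t[0])[1]

if __name__ == "__main__":
    print(dynamic_search())
```

The key design choice this sketch tries to capture is that the active subset is recomputed every iteration from the current weight vector, so the effective search space shrinks differently for each objective trade-off rather than being pruned once up front.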

Country of Origin
🇩🇪 Germany

Page Count
11 pages

Category
Computer Science:
Machine Learning (CS)