POCAII: Parameter Optimization with Conscious Allocation using Iterative Intelligence
By: Joshua Inman, Tanmay Khandait, Lalitha Sankar, and more
Potential Business Impact:
Finds the best computer settings faster, saving time.
In this paper we propose POCAII, a new hyperparameter optimization (HPO) algorithm. POCAII differs from the Hyperband and Successive Halving literature by explicitly separating the search and evaluation phases and by applying principled exploration and exploitation strategies in both. This separation yields a highly flexible scheme for managing a hyperparameter optimization budget: effort is focused on search (i.e., generating competing configurations) at the start of the HPO process and shifted towards evaluation as the process nears its end. POCAII was compared to the state-of-the-art approaches SMAC, BOHB, and DEHB, and shows superior performance in low-budget hyperparameter optimization regimes. Since many practitioners do not have exhaustive resources to devote to HPO, this makes it widely applicable to real-world problems. Moreover, the empirical evidence shows that POCAII achieves higher robustness and lower variance in its results, which is especially important in realistic scenarios where models are extremely expensive to train.
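The search-versus-evaluation budget shift described in the abstract can be illustrated with a small sketch. The code below is not POCAII itself: the decaying schedule `pocaii_style_budget_split`, the helper `toy_hpo`, and the toy objective are all hypothetical, chosen only to show how a fixed per-round budget might move from sampling new configurations early to re-evaluating incumbents late.

```python
import random

def pocaii_style_budget_split(t, T, search_bias=2.0):
    """Assumed schedule: fraction of budget spent on search decays
    as the HPO run progresses (t = current round, T = total rounds)."""
    progress = t / T
    return (1.0 - progress) ** search_bias  # high early, low late

def toy_hpo(objective, sample_config, total_rounds=20, budget_per_round=10):
    """Toy loop under the assumed schedule: sample fresh configurations
    (search) early, then re-evaluate promising ones (evaluation) later."""
    configs = []  # records of [config, running mean loss, eval count]
    for t in range(total_rounds):
        frac_search = pocaii_style_budget_split(t, total_rounds)
        n_search = round(budget_per_round * frac_search)
        n_eval = budget_per_round - n_search
        # Search phase: generate and score new candidate configurations.
        for _ in range(n_search):
            c = sample_config()
            configs.append([c, objective(c), 1])
        # Evaluation phase: spend remaining budget refining the leaders.
        configs.sort(key=lambda rec: rec[1])
        for rec in configs[:max(1, n_eval)]:
            rec[1] = (rec[1] * rec[2] + objective(rec[0])) / (rec[2] + 1)
            rec[2] += 1
    return min(configs, key=lambda rec: rec[1])

if __name__ == "__main__":
    best = toy_hpo(
        objective=lambda x: (x - 0.3) ** 2 + random.gauss(0, 0.01),
        sample_config=lambda: random.random(),
    )
    print("best config:", best[0], "estimated loss:", best[1])
```

Early rounds here spend nearly the whole budget proposing configurations; late rounds spend it re-scoring the current leaders, averaging out evaluation noise, which mirrors the budget-management idea the paper describes.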
Similar Papers
Grouped Sequential Optimization Strategy -- the Application of Hyperparameter Importance Assessment in Deep Learning
Machine Learning (CS)
Makes computer learning faster by finding the best settings.
Hyperparameter Optimisation with Practical Interpretability and Explanation Methods in Probabilistic Curriculum Learning
Machine Learning (CS)
Makes computer learning faster and easier.
Iterated Population Based Training with Task-Agnostic Restarts
Machine Learning (CS)
Finds the best computer learning settings automatically.