Iterated Population Based Training with Task-Agnostic Restarts
By: Alexander Chebykin, Tanja Alderliesten, Peter A. N. Bosman
Potential Business Impact:
**Finds the best computer learning settings automatically.**
Hyperparameter Optimization (HPO) can lift the burden of tuning the hyperparameters (HPs) of neural networks. HPO algorithms from the Population Based Training (PBT) family are efficient because they dynamically adjust HPs every few steps of weight optimization. Recent results indicate that the number of steps between HP updates is an important meta-HP of all PBT variants that can substantially affect their performance. Yet, no method or intuition is available for efficiently setting its value. We introduce Iterated Population Based Training (IPBT), a novel PBT variant that automatically adjusts this HP via restarts that reuse weight information in a task-agnostic way and leverage time-varying Bayesian optimization to reinitialize HPs. Evaluation on 8 image classification and reinforcement learning tasks shows that, on average, our algorithm matches or outperforms 5 previous PBT variants and other HPO algorithms (random search, ASHA, SMAC3), without requiring a budget increase or any changes to its HPs. The source code is available at https://github.com/AwesomeLemon/IPBT.
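To make the meta-HP concrete, below is a minimal sketch of a generic PBT loop in Python. It is not the authors' IPBT implementation (see the linked repository for that); the names `steps_per_hp_update`, `train_steps`, and `exploit_and_explore`, along with the toy training dynamics, are illustrative assumptions. The sketch only shows that `steps_per_hp_update` controls how often the exploit/explore step runs relative to weight optimization, which is the value IPBT sets automatically via its restarts.

```python
import random

# Hypothetical sketch of a generic PBT loop (not the authors' IPBT code).
# It illustrates the meta-HP discussed above: `steps_per_hp_update`, the
# number of weight-optimization steps between HP updates.

POP_SIZE = 4
TOTAL_STEPS = 1000
steps_per_hp_update = 50  # the meta-HP that IPBT adjusts automatically


def train_steps(member, n):
    """Placeholder: advance the member's weights by n optimizer steps."""
    member["score"] += member["hp"]["lr"] * random.random()  # toy dynamics


def exploit_and_explore(population):
    """Classic PBT step: copy the state of a top member into the worst
    member, then perturb the copied HPs."""
    population.sort(key=lambda m: m["score"], reverse=True)
    top, bottom = population[0], population[-1]
    bottom["score"] = top["score"]                    # stands in for copying weights
    bottom["hp"] = dict(top["hp"])                    # copy HPs...
    bottom["hp"]["lr"] *= random.choice([0.8, 1.2])   # ...and perturb them


population = [{"hp": {"lr": 10 ** random.uniform(-4, -1)}, "score": 0.0}
              for _ in range(POP_SIZE)]

# HPs are updated every `steps_per_hp_update` steps; a poor choice of this
# interval can substantially hurt any PBT variant, which is the gap IPBT
# targets.
for step in range(0, TOTAL_STEPS, steps_per_hp_update):
    for member in population:
        train_steps(member, steps_per_hp_update)
    exploit_and_explore(population)

print(max(m["score"] for m in population))
```

In this toy setting the interval is fixed up front; the paper's contribution is to avoid hand-picking it by restarting training in a task-agnostic way that reuses weight information and reinitializes HPs with time-varying Bayesian optimization.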
Similar Papers:
- Multiple-Frequencies Population-Based Training (Machine Learning (CS)): Helps AI learn better by trying many ways.
- Dynamic Priors in Bayesian Optimization for Hyperparameter Optimization (Machine Learning (CS)): Lets users guide smart computer learning.
- Hyperparameter Optimisation with Practical Interpretability and Explanation Methods in Probabilistic Curriculum Learning (Machine Learning (CS)): Makes computer learning faster and easier.