Optimization on black-box function by parameter-shift rule
By: Vu Tuan Hai
Potential Business Impact:
Trains computers faster with fewer guesses.
Machine learning is now applied across many domains, but training a machine learning model remains difficult. A growing class of optimization problems, called "black-box" problems, arises when the relationship between model parameters and outcomes is unknown or too complex to trace. Existing algorithms for optimizing black-box models require a large number of query observations, which becomes impractical as the number of parameters grows. To overcome these drawbacks, in this study we propose a zeroth-order method that originated in quantum computing, the parameter-shift rule, which uses fewer queries than previous methods.
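As a rough illustration of the idea (not the paper's exact algorithm), the parameter-shift rule gives an exact gradient for functions with a sinusoidal dependence on each parameter, such as expectation values of single-parameter rotation gates, using only two black-box evaluations per parameter. The sketch below assumes that trigonometric structure; the function `parameter_shift_grad` and the toy objective `f` are illustrative names, not from the paper.

```python
import numpy as np

def parameter_shift_grad(f, theta, s=np.pi / 2):
    """Estimate the gradient of a black-box f at theta using the
    parameter-shift rule: two evaluations per parameter, no access
    to f's internals (zeroth-order).

    Exact when f(theta) = a*cos(theta_i) + b*sin(theta_i) + c in each
    coordinate, since then
        df/dtheta_i = [f(theta + s*e_i) - f(theta - s*e_i)] / (2*sin(s)).
    """
    theta = np.asarray(theta, dtype=float)
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        shift = np.zeros_like(theta)
        shift[i] = s  # shift only the i-th parameter
        grad[i] = (f(theta + shift) - f(theta - shift)) / (2 * np.sin(s))
    return grad

# Toy black-box with the assumed trigonometric structure:
f = lambda t: np.cos(t[0]) + 0.5 * np.sin(t[1])
g = parameter_shift_grad(f, [0.3, 1.1])
# The analytic gradient is [-sin(0.3), 0.5*cos(1.1)], which the
# shift rule recovers exactly (up to floating-point error).
```

With the default shift s = pi/2, the formula reduces to (f(theta + pi/2) - f(theta - pi/2)) / 2, so the cost is 2n queries for n parameters, independent of how f is implemented internally.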
Similar Papers
Sample-Efficient Bayesian Transfer Learning for Online Machine Parameter Optimization
Machine Learning (CS)
Finds best machine settings faster, saving money.
From Data to Uncertainty Sets: a Machine Learning Approach
Machine Learning (CS)
Protects important rules from computer prediction errors.
Zeroth-Order Optimization Finds Flat Minima
Machine Learning (CS)
Finds better answers when computers can't see inside.