Function-on-Function Bayesian Optimization
By: Jingru Huang, Haijie Xu, Manrui Jiang, and more
Potential Business Impact:
Finds best settings for complex computer programs.
Bayesian optimization (BO) has been widely used to optimize expensive, gradient-free objective functions across various domains. However, existing BO methods do not address objectives whose inputs and outputs are both functions, a setting that arises increasingly often in complex systems equipped with advanced sensing technologies. To fill this gap, we propose a novel function-on-function Bayesian optimization (FFBO) framework. Specifically, we first introduce a function-on-function Gaussian process (FFGP) model with a separable operator-valued kernel to capture the correlations between function-valued inputs and outputs. Unlike existing Gaussian process models, FFGP is modeled directly in function space. Based on FFGP, we define a scalar upper confidence bound (UCB) acquisition function using a weighted operator-based scalarization strategy. A scalable functional gradient ascent (FGA) algorithm is then developed to efficiently identify the optimal function-valued input. We further analyze the theoretical properties of the proposed method. Extensive experiments on synthetic and real-world data demonstrate the superior performance of FFBO over existing approaches.
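The abstract outlines three computational ingredients: an FFGP surrogate with a separable operator-valued kernel, a weighted scalarized UCB acquisition, and functional gradient ascent over the input function. The sketch below is a rough, assumption-laden illustration of that pipeline rather than the authors' implementation: function-valued inputs and outputs are discretized on fixed grids, the separable kernel is approximated by a Kronecker product of an input kernel and an output-index kernel, and the scalarized UCB is maximized by finite-difference gradient ascent over the discretized input curve. All grids, weights, hyperparameters, and function names (rbf, ffgp_posterior, scalarized_ucb, next_input) are illustrative assumptions, not taken from the paper.

```python
# Minimal numerical sketch of the FFBO pipeline described above -- NOT the
# authors' implementation. Function-valued inputs/outputs live on fixed grids;
# the separable operator-valued kernel is approximated by a Kronecker product
# of an input kernel and an output-index kernel; a weighted scalarized UCB is
# maximized by finite-difference gradient ascent over the discretized input curve.
import numpy as np

def rbf(A, B, ls):
    """Squared-exponential kernel between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def ffgp_posterior(X, Y, x_new, s_grid, ls_in=1.0, ls_out=0.3, noise=1e-4):
    """Posterior mean/variance of the output curve at x_new under a separable
    kernel K((x, s), (x', s')) = k_in(x, x') * k_out(s, s') (grid approximation)."""
    K_in = rbf(X, X, ls_in)                                  # n x n  (input curves)
    K_out = rbf(s_grid[:, None], s_grid[:, None], ls_out)    # m x m  (output index)
    n, m = K_in.shape[0], K_out.shape[0]
    K = np.kron(K_in, K_out) + noise * np.eye(n * m)         # separability => Kronecker
    K_star = np.kron(rbf(X, x_new[None, :], ls_in), K_out)   # (n*m) x m
    alpha = np.linalg.solve(K, Y.reshape(-1))
    mean = K_star.T @ alpha
    v = np.linalg.solve(K, K_star)
    # prior variance is k_in(x_new, x_new) * diag(K_out), and k_in(x, x) = 1 for the RBF
    var = np.clip(np.diag(K_out) - np.einsum("ij,ij->j", K_star, v), 1e-12, None)
    return mean, var

def scalarized_ucb(mean, var, w, beta=2.0):
    """Weighted-scalarization UCB over the output grid (w is an assumed weight vector)."""
    return w @ mean + beta * np.sqrt(w @ var)

def next_input(X, Y, s_grid, w, x0, steps=100, lr=0.05, eps=1e-3):
    """Stand-in for functional gradient ascent: finite-difference ascent on the grid."""
    x = x0.copy()
    for _ in range(steps):
        base = scalarized_ucb(*ffgp_posterior(X, Y, x, s_grid), w)
        grad = np.zeros_like(x)
        for j in range(x.size):
            xp = x.copy()
            xp[j] += eps
            grad[j] = (scalarized_ucb(*ffgp_posterior(X, Y, xp, s_grid), w) - base) / eps
        x += lr * grad
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t_grid = np.linspace(0, 1, 20)                       # grid for input functions x(t)
    s_grid = np.linspace(0, 1, 15)                       # grid for output functions y(s)
    X = rng.normal(size=(5, t_grid.size))                # 5 observed (discretized) input curves
    Y = np.sin(X.sum(axis=1, keepdims=True) * s_grid)    # toy function-valued responses
    w = np.full(s_grid.size, 1.0 / s_grid.size)          # uniform scalarization weights
    print(next_input(X, Y, s_grid, w, x0=X[0].copy()))   # proposed next input curve
```

The Kronecker structure simply mirrors the separability assumption and keeps the grid approximation tractable; the finite-difference loop merely stands in for the paper's functional gradient ascent, which operates directly in function space.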
Similar Papers
Bayesian Optimization for Function-Valued Responses under Min-Max Criteria
Machine Learning (CS)
Finds the worst possible mistakes to fix them.
FigBO: A Generalized Acquisition Function Framework with Look-Ahead Capability for Bayesian Optimization
Machine Learning (CS)
Finds best answers faster by looking ahead.
On the Implementation of a Bayesian Optimization Framework for Interconnected Systems
Machine Learning (Stat)
Finds best answers faster by using known parts.