ROOT: Rethinking Offline Optimization as Distributional Translation via Probabilistic Bridge
By: Manh Cuong Dao, The Hung Tran, Phi Le Nguyen, and more
Potential Business Impact:
Finds best answers with little information.
This paper studies black-box optimization, which aims to find the maximum of a black-box function using a static set of its observed input-output pairs. This is often achieved by learning and optimizing a surrogate function on that offline data. Alternatively, it can be framed as an inverse modeling task that maps a desired performance level to input candidates that achieve it. Both approaches are constrained by the limited amount of offline data. To mitigate this limitation, we introduce a new perspective that casts offline optimization as a distributional translation task: learning a probabilistic bridge that transforms an implicit distribution of low-value inputs (i.e., the offline data) into a distribution of high-value inputs (i.e., solution candidates). Such a bridge can be learned from low- and high-value inputs sampled from synthetic functions that resemble the target function. These synthetic functions are constructed as the posterior means of multiple Gaussian processes fitted to the offline data under different parameterizations, alleviating the data bottleneck. The proposed approach is evaluated on an extensive benchmark against the most recent methods, demonstrating significant improvement and establishing new state-of-the-art performance.
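To make the synthetic-function step concrete, here is a minimal sketch, assuming NumPy and scikit-learn; the toy data, kernel choices, and split sizes are illustrative assumptions, not the authors' implementation. It fits several Gaussian processes with different kernel parameterizations on the offline data, averages their posterior means into a synthetic function, and harvests low- and high-value input sets from it, the kind of paired samples a probabilistic bridge model could then be trained to translate between.

```python
# A hedged sketch of building synthetic functions from GP posterior means
# and sampling low-/high-value inputs from them. Everything here (data,
# kernels, sizes) is an illustrative assumption, not the paper's code.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, Matern

rng = np.random.default_rng(0)

# Toy stand-in for the offline dataset: inputs and noisy scores.
X_offline = rng.uniform(-2.0, 2.0, size=(64, 4))
y_offline = -np.sum(X_offline**2, axis=1) + 0.1 * rng.standard_normal(64)

# Several GP parameterizations fitted to the same offline data.
kernels = [RBF(length_scale=0.5), RBF(length_scale=2.0), Matern(nu=1.5)]
surrogates = [
    GaussianProcessRegressor(kernel=k, normalize_y=True).fit(X_offline, y_offline)
    for k in kernels
]

def synthetic_f(X):
    """Average of the GP posterior means, used as a cheap synthetic function."""
    return np.mean([gp.predict(X) for gp in surrogates], axis=0)

# Score random probes with the synthetic function, then split them into
# low-value inputs (source distribution) and high-value inputs (target).
X_probe = rng.uniform(-2.0, 2.0, size=(1024, 4))
scores = synthetic_f(X_probe)
order = np.argsort(scores)
X_low, X_high = X_probe[order[:256]], X_probe[order[-256:]]

# A bridge model would now be trained to translate samples of X_low toward
# X_high; that training loop is method-specific and omitted here.
print(X_low.shape, X_high.shape, scores[order[-256:]].mean() - scores[order[:256]].mean())
```

Averaging posterior means across differently parameterized GPs, rather than trusting a single fit, is what lets many cheap synthetic functions be drawn from one small offline dataset.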
Similar Papers
Learning Surrogates for Offline Black-Box Optimization via Gradient Matching
Machine Learning (CS)
Finds best designs without costly real tests.
Offline Model-Based Optimization: Comprehensive Review
Machine Learning (CS)
Finds best designs from old data.
Incorporating Surrogate Gradient Norm to Improve Offline Optimization Techniques
Machine Learning (CS)
Makes computer guesses better without real-world tests.