Optimization Performance of Factorization Machine with Annealing under Limited Training Data
By: Mayumi Nakano, Yuya Seki, Shuta Kikuchi, and more
Potential Business Impact:
Finds better answers faster by focusing on the newest data.
Black-box (BB) optimization problems aim to identify an input that minimizes the output of a function (the BB function) whose input-output relationship is unknown. Factorization machine with annealing (FMA) is a promising approach to this task, employing a factorization machine (FM) as a surrogate model to iteratively guide the solution search via an Ising machine. Although FMA has demonstrated strong optimization performance across various applications, its performance often stagnates as the number of optimization iterations increases. One contributing factor to this stagnation is the growing number of data points in the dataset used to train FM. It is hypothesized that as more data points are accumulated, the contribution of newly added data points becomes diluted within the entire dataset, thereby reducing their impact on improving the prediction accuracy of FM. To address this issue, we propose a novel method for sequential dataset construction that retains at most a specified number of the most recently added data points. This strategy is designed to enhance the influence of newly added data points on the surrogate model. Numerical experiments demonstrate that the proposed FMA achieves lower-cost solutions with fewer BB function evaluations compared to the conventional FMA.
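The core of the proposed change is a sliding-window training set that keeps only the most recently evaluated points. The sketch below illustrates that idea under stated assumptions: the names black_box, train_surrogate, propose_candidate, and WINDOW_SIZE are hypothetical placeholders, and the FM surrogate and Ising-machine search are replaced by toy stand-ins, since the abstract does not give the authors' implementation.

```python
# Minimal sketch of the sliding-window dataset idea described in the abstract.
# All names (black_box, train_surrogate, propose_candidate, WINDOW_SIZE) are
# illustrative placeholders, not the authors' implementation: the FM surrogate
# and the Ising-machine search are replaced by simple stand-ins.

from collections import deque
import random

N_BITS = 16          # length of the binary input vector
WINDOW_SIZE = 50     # keep at most this many of the most recent data points
N_ITERATIONS = 200

def black_box(x):
    """Toy BB function: the cost is the number of ones (unknown to the optimizer)."""
    return sum(x)

def train_surrogate(dataset):
    """Placeholder for FM training: score each bit by its average association
    with the observed costs."""
    scores = [0.0] * N_BITS
    for x, y in dataset:
        for i, xi in enumerate(x):
            scores[i] += y if xi else -y
    return scores

def propose_candidate(scores):
    """Placeholder for the Ising-machine search: set the bits whose estimated
    contribution lowers the surrogate cost, with a little noise for exploration."""
    return [1 if s + random.gauss(0, 0.1) < 0 else 0 for s in scores]

# Sliding-window dataset: deque(maxlen=...) drops the oldest point automatically,
# so newly added points keep a large influence on the surrogate.
dataset = deque(maxlen=WINDOW_SIZE)

# Initial random samples.
for _ in range(10):
    x = [random.randint(0, 1) for _ in range(N_BITS)]
    dataset.append((x, black_box(x)))

best_x, best_y = min(dataset, key=lambda d: d[1])
for _ in range(N_ITERATIONS):
    scores = train_surrogate(dataset)
    x = propose_candidate(scores)
    y = black_box(x)                 # one BB function evaluation per iteration
    dataset.append((x, y))           # oldest point is evicted once the window is full
    if y < best_y:
        best_x, best_y = x, y

print("best cost found:", best_y)
```

Using a deque with maxlen gives the "at most a specified number of the most recently added data points" rule essentially for free: once the window is full, appending a new point evicts the oldest one, so the surrogate is always retrained on the freshest data.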
Similar Papers
Subsampling Factorization Machine Annealing
Quantum Physics
Solves hard problems faster and more accurately.
Extended Factorization Machine Annealing for Rapid Discovery of Transparent Conducting Materials
Materials Science
Finds better materials for screens and solar panels.
Annealed Mean Field Descent Is Highly Effective for Quadratic Unconstrained Binary Optimization
Optimization and Control
Finds the best answers to tough problems faster.