Model-Driven Subspaces for Large-Scale Optimization with Local Approximation Strategy
By: Yitong He, Pengcheng Xie
Potential Business Impact:
Finds best answers to hard problems faster.
Large-scale optimization is a computational bottleneck in machine learning and many scientific applications. Subspace-based methods that use a local approximation strategy are among the most effective approaches to such problems. This paper investigates several novel classes of advanced subspaces for these methods and presents a new algorithm built on them, called MD-LAMBO. Theoretical analysis of the new algorithm covers the subspaces' properties, sufficient decrease of the function value, and global convergence. Model construction on the subspaces is described under derivative-free settings. Numerical results, including performance profiles and truncated Newton step errors for MD-LAMBO with different model-driven subspaces, show subspace-dependent numerical differences and demonstrate the advantages of the proposed methods and subspaces.
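To make the general idea concrete, here is a minimal sketch of subspace-based, derivative-free optimization: each iteration restricts the problem to a low-dimensional subspace, builds a local model from function values only, and steps within that subspace. The random orthonormal subspace, the toy least-squares objective, and the line-search model below are illustrative assumptions; they do not reproduce the paper's model-driven subspaces or the MD-LAMBO algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy objective: regularized least squares in R^50, standing in for an
# expensive black-box function (only values of f are ever used).
m, n, mu = 100, 50, 0.1
A = rng.standard_normal((m, n)) / np.sqrt(m)
b = rng.standard_normal(m)

def f(x):
    return 0.5 * np.linalg.norm(A @ x - b) ** 2 + 0.5 * mu * np.dot(x, x)

def subspace_step(f, x, k=5, h=1e-4):
    """One derivative-free iteration: estimate the gradient of f restricted
    to a k-dimensional subspace by central differences, then take an exact
    quadratic line-search step along the projected descent direction.

    The subspace here is a random orthonormal basis -- a placeholder for
    the paper's model-driven subspaces, which this sketch does not model.
    """
    Q, _ = np.linalg.qr(rng.standard_normal((x.size, k)))
    g = np.array([(f(x + h * Q[:, i]) - f(x - h * Q[:, i])) / (2 * h)
                  for i in range(k)])        # subspace gradient estimate
    d = -Q @ g                               # descent direction in the subspace
    f0, f1, f2 = f(x), f(x + d), f(x + 2 * d)
    a = 0.5 * (f2 - 2 * f1 + f0)             # curvature of f along d
    c = f1 - f0 - a                          # slope of the 1-D quadratic fit
    t = -c / (2 * a) if a > 1e-12 else 0.0   # minimizer of the fit, or no step
    return x + t * d

x = np.zeros(n)
for _ in range(300):
    x = subspace_step(f, x)
```

Because only the k subspace directions are probed per iteration, each step costs a handful of function evaluations regardless of the ambient dimension n, which is the appeal of subspace methods for large-scale problems.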
Similar Papers
Active Subspaces in Infinite Dimension
Machine Learning (Stat)
Simplifies hard math problems for computers.
Deep Learning for Subspace Regression
Machine Learning (CS)
Teaches computers to guess answers for complex problems.
Taming High-Dimensional Dynamics: Learning Optimal Projections onto Spectral Submanifolds
Systems and Control
Makes robots move more accurately and smoothly.