Solving bilevel optimization via sequential minimax optimization
By: Zhaosong Lu, Sanyou Mei
Potential Business Impact:
Solves nested ("bilevel") optimization problems, which arise in machine learning and decision-making, faster than prior methods.
In this paper we propose a sequential minimax optimization (SMO) method for solving a class of constrained bilevel optimization problems in which the lower-level part is a possibly nonsmooth convex optimization problem, while the upper-level part is a possibly nonconvex optimization problem. Specifically, SMO applies a first-order method to solve a sequence of minimax subproblems, which are obtained by employing a hybrid of modified augmented Lagrangian and penalty schemes on the bilevel optimization problems. Under suitable assumptions, we establish an operation complexity of $O(\varepsilon^{-7}\log\varepsilon^{-1})$ and $O(\varepsilon^{-6}\log\varepsilon^{-1})$, measured in terms of fundamental operations, for SMO in finding an $\varepsilon$-KKT solution of the bilevel optimization problems with merely convex and strongly convex lower-level objective functions, respectively. The latter result improves the previous best-known operation complexity by a factor of $\varepsilon^{-1}$. Preliminary numerical results demonstrate significantly superior computational performance compared to the recently developed first-order penalty method.
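To make the penalty idea concrete, here is a minimal sketch on a toy problem, not the paper's SMO method: the bilevel problem min_x f(x, y*(x)) with y*(x) = argmin_y g(x, y) is replaced by minimizing f(x, y) + rho * (g(x, y) - min_z g(x, z)) for an increasing penalty parameter rho, using plain gradient descent as the first-order solver. The functions f and g below are illustrative choices (quadratics with a known solution at (1, 1)), not from the paper.

```python
# Hedged sketch of a penalty reformulation of a bilevel problem.
# Toy instance (assumed, not from the paper):
#   upper level: f(x, y) = (x - 1)^2 + (y - 1)^2
#   lower level: g(x, y) = (y - x)^2, strongly convex in y, so y*(x) = x
# Penalty surrogate: F_rho(x, y) = f(x, y) + rho * (g(x, y) - min_z g(x, z)),
# and here min_z g(x, z) = 0, so the lower-level value function is explicit.

def grad_penalty(x, y, rho):
    # Gradient of F_rho(x, y) = (x-1)^2 + (y-1)^2 + rho*(y-x)^2.
    gx = 2.0 * (x - 1.0) - 2.0 * rho * (y - x)
    gy = 2.0 * (y - 1.0) + 2.0 * rho * (y - x)
    return gx, gy

def penalty_bilevel(rhos=(1.0, 10.0, 100.0), steps=5000):
    """Gradient descent on a sequence of penalty subproblems."""
    x, y = 0.0, 0.0
    for rho in rhos:                    # increasing penalty schedule
        lr = 1.0 / (4.0 + 4.0 * rho)    # safe step size: largest Hessian
                                        # eigenvalue of F_rho is 2 + 4*rho
        for _ in range(steps):
            gx, gy = grad_penalty(x, y, rho)
            x -= lr * gx
            y -= lr * gy
    return x, y

x, y = penalty_bilevel()
# (x, y) approaches the bilevel solution (1, 1) of this toy instance
```

The paper's SMO method differs in key ways: it handles constraints and nonsmooth lower-level objectives, combines a modified augmented Lagrangian with the penalty term, and solves each subproblem as a minimax problem rather than by plain gradient descent; this sketch only illustrates the penalty-on-the-value-function idea.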
Similar Papers
Lower Complexity Bounds for Nonconvex-Strongly-Convex Bilevel Optimization with First-Order Oracles
Machine Learning (CS)
Makes solving tricky math problems much faster.
Stochastic Bilevel Optimization with Heavy-Tailed Noise
Machine Learning (CS)
Teaches computers to learn better from messy data.
Faster Gradient Methods for Highly-smooth Stochastic Bilevel Optimization
Optimization and Control
Makes smart computer learning faster and better.