Solving Convex-Concave Problems with $\tilde{\mathcal{O}}(\epsilon^{-4/7})$ Second-Order Oracle Complexity
By: Lesi Chen, Chengchang Liu, Luo Luo, et al.
Potential Business Impact:
Cuts the number of expensive Hessian-based computations needed to solve convex-concave minimax problems, a class of optimization problems that arises throughout machine learning and game-theoretic applications.
Previous algorithms can solve convex-concave minimax problems $\min_{x \in \mathcal{X}} \max_{y \in \mathcal{Y}} f(x,y)$ with $\mathcal{O}(\epsilon^{-2/3})$ second-order oracle calls using Newton-type methods. This result has been speculated to be optimal because the upper bound is achieved by a natural generalization of the optimal first-order method. In this work, we show an improved upper bound of $\tilde{\mathcal{O}}(\epsilon^{-4/7})$ by generalizing the optimal second-order method for convex optimization to solve the convex-concave minimax problem. We further apply a similar technique to lazy Hessian algorithms and show that our proposed algorithm can also be seen as a second-order ``Catalyst'' framework (Lin et al., JMLR 2018) that can accelerate any globally convergent algorithm for solving minimax problems.
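To make the problem class concrete, the sketch below solves a toy convex-concave saddle-point problem with the classical first-order extragradient method. This is not the paper's second-order algorithm; the objective $f(x,y) = (x-1)^2 - (y+0.5)^2 + xy$, the step size, and the iteration count are illustrative choices for this example only.

```python
# Toy convex-concave minimax problem: min_x max_y f(x, y)
# with f(x, y) = (x - 1)^2 - (y + 0.5)^2 + x * y.
# Solved with the first-order extragradient method (an illustrative
# baseline, NOT the paper's second-order method).

def grad_x(x, y):
    # partial derivative of f with respect to x
    return 2 * (x - 1) + y

def grad_y(x, y):
    # partial derivative of f with respect to y
    return -2 * (y + 0.5) + x

def extragradient(x0=0.0, y0=0.0, eta=0.1, iters=2000):
    x, y = x0, y0
    for _ in range(iters):
        # prediction half-step: descend in x, ascend in y
        xh = x - eta * grad_x(x, y)
        yh = y + eta * grad_y(x, y)
        # correction step using gradients at the predicted point
        x = x - eta * grad_x(xh, yh)
        y = y + eta * grad_y(xh, yh)
    return x, y

x_star, y_star = extragradient()
# The saddle point is (1, 0), where both partial gradients vanish.
```

Plain gradient descent-ascent can cycle on such problems; the extragradient correction step is what guarantees convergence here, and Newton-type methods of the kind studied in the paper replace these gradient steps with second-order updates to reduce the oracle complexity.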
Similar Papers
Lower Complexity Bounds for Nonconvex-Strongly-Convex Bilevel Optimization with First-Order Oracles
Machine Learning (CS)
Establishes fundamental limits on how quickly first-order methods can solve this class of bilevel optimization problems.
On the Condition Number Dependency in Bilevel Optimization
Optimization and Control
Characterizes how problem conditioning affects the difficulty of bilevel optimization.
Improving Online-to-Nonconvex Conversion for Smooth Optimization via Double Optimism
Optimization and Control
Refines the online-to-nonconvex conversion technique to obtain faster methods for smooth optimization.