Dual Acceleration for Minimax Optimization: Linear Convergence Under Relaxed Assumptions
By: Jingwang Li, Xiao Li
Potential Business Impact:
Solves hard math problems faster and more reliably.
This paper addresses the bilinearly coupled minimax optimization problem: $\min_{x \in \mathbb{R}^{d_x}}\max_{y \in \mathbb{R}^{d_y}} \ f_1(x) + f_2(x) + y^{\top} Bx - g_1(y) - g_2(y)$, where $f_1$ and $g_1$ are smooth convex functions, $f_2$ and $g_2$ are potentially nonsmooth convex functions, and $B$ is a coupling matrix. Existing algorithms for solving this problem achieve linear convergence only under stronger conditions, which may not be met in many scenarios. We first introduce the Primal-Dual Proximal Gradient (PDPG) method and demonstrate that it converges linearly under an assumption for which existing algorithms fail to achieve linear convergence. Building on insights gained from analyzing the convergence conditions of existing algorithms and of PDPG, we further propose the inexact Dual Accelerated Proximal Gradient (iDAPG) method, which achieves linear convergence under weaker conditions than those required by existing approaches. Moreover, even in cases where existing methods do guarantee linear convergence, iDAPG can still provide superior theoretical performance in certain scenarios.
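To make the problem template concrete, the sketch below runs a standard primal-dual proximal gradient iteration (in the Condat–Vũ / Chambolle–Pock style, not necessarily the paper's exact PDPG scheme) on a toy instance with $f_1(x) = \tfrac12\|x\|^2$, $f_2(x) = \lambda\|x\|_1$, $g_1(y) = \tfrac12\|y\|^2$, $g_2 = 0$. All step sizes and problem data here are illustrative assumptions, not values from the paper.

```python
import numpy as np

def prox_l1(v, t):
    # Soft-thresholding: the proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(0)
dx, dy = 5, 4
B = rng.normal(size=(dy, dx)) / 3.0   # coupling matrix (illustrative)
lam = 0.1                             # weight of the nonsmooth term f_2
tau = sigma = 0.2                     # primal/dual step sizes (chosen small enough to converge)

x = np.zeros(dx)
y = np.zeros(dy)
for _ in range(2000):
    # Primal step: gradient of f_1(x) = 0.5||x||^2 plus coupling B^T y,
    # followed by the prox of f_2 = lam * ||.||_1.
    x_new = prox_l1(x - tau * (x + B.T @ y), tau * lam)
    # Dual step with extrapolated primal point 2*x_new - x;
    # gradient of g_1(y) = 0.5||y||^2 is y, and g_2 = 0 so no dual prox.
    y = y + sigma * (B @ (2 * x_new - x) - y)
    x = x_new

# At a saddle point, the dual optimality condition gives y = B x.
print(np.linalg.norm(B @ x - y))
```

The extrapolation `2 * x_new - x` in the dual step is what distinguishes this scheme from a plain alternating gradient method and is standard in primal-dual splitting algorithms for bilinearly coupled problems.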
Similar Papers
Accelerated Decentralized Constraint-Coupled Optimization: A Dual$^2$ Approach
Optimization and Control
Helps many computers work together faster.
Anderson Accelerated Primal-Dual Hybrid Gradient for solving LP
Optimization and Control
Solves math problems much faster.
Improved Rates of Differentially Private Nonconvex-Strongly-Concave Minimax Optimization
Machine Learning (CS)
Makes AI learn private data without seeing it.