Score: 1

Dual Acceleration for Minimax Optimization: Linear Convergence Under Relaxed Assumptions

Published: May 4, 2025 | arXiv ID: 2505.02115v2

By: Jingwang Li, Xiao Li

Potential Business Impact:

Solves a broad class of minimax optimization problems faster and under weaker assumptions, making such solvers more reliable in practice.

Business Areas:
A/B Testing, Data and Analytics

This paper addresses the bilinearly coupled minimax optimization problem: $\min_{x \in \mathbb{R}^{d_x}}\max_{y \in \mathbb{R}^{d_y}} \ f_1(x) + f_2(x) + y^{\top} Bx - g_1(y) - g_2(y)$, where $f_1$ and $g_1$ are smooth convex functions, $f_2$ and $g_2$ are potentially nonsmooth convex functions, and $B$ is a coupling matrix. Existing algorithms for this problem achieve linear convergence only under relatively strong assumptions, which may not hold in many scenarios. We first introduce the Primal-Dual Proximal Gradient (PDPG) method and show that it converges linearly under an assumption for which existing algorithms fail to achieve linear convergence. Building on insights gained from analyzing the convergence conditions of existing algorithms and PDPG, we further propose the inexact Dual Accelerated Proximal Gradient (iDAPG) method, which achieves linear convergence under weaker conditions than those required by existing approaches. Moreover, even in cases where existing methods do guarantee linear convergence, iDAPG can still provide superior theoretical performance in certain scenarios.
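
To make the problem template concrete, the sketch below runs a generic Condat-Vu / PDHG-style primal-dual proximal-gradient iteration on a small synthetic instance of this problem class. It is not the paper's PDPG or iDAPG method; the particular choices of $f_1$, $f_2$, $g_1$, $g_2$, the coupling matrix $B$, and the step sizes are illustrative assumptions.

```python
# Minimal sketch of a generic primal-dual proximal-gradient iteration for
#   min_x max_y  f1(x) + f2(x) + y^T B x - g1(y) - g2(y),
# with f1, g1 smooth convex and f2, g2 nonsmooth but prox-friendly.
# NOT the paper's PDPG/iDAPG; all concrete choices below are assumptions.
import numpy as np

rng = np.random.default_rng(0)
dx, dy = 20, 10
B = rng.standard_normal((dy, dx))   # coupling matrix (illustrative)
a = rng.standard_normal(dx)

# Smooth parts and their gradients (toy quadratics, assumed for illustration).
grad_f1 = lambda x: x - a           # f1(x) = 0.5 * ||x - a||^2
grad_g1 = lambda y: y               # g1(y) = 0.5 * ||y||^2

# Nonsmooth parts handled through proximal operators (soft-thresholding).
def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

lam, mu = 0.1, 0.1                  # f2(x) = lam*||x||_1, g2(y) = mu*||y||_1
prox_f2 = lambda v, tau: soft_threshold(v, tau * lam)
prox_g2 = lambda v, sigma: soft_threshold(v, sigma * mu)

# Conservative step sizes; PDHG-type schemes need tau*sigma*||B||^2 < 1
# (with extra room for the smooth terms), so these are deliberately small.
L_B = np.linalg.norm(B, 2)
tau = sigma = 0.5 / (1.0 + L_B)

x, y = np.zeros(dx), np.zeros(dy)
for _ in range(500):
    # Primal proximal-gradient step on f1 + f2 + <y, Bx>.
    x_new = prox_f2(x - tau * (grad_f1(x) + B.T @ y), tau)
    # Extrapolated primal point, then a dual proximal-gradient step.
    x_bar = 2.0 * x_new - x
    y = prox_g2(y + sigma * (B @ x_bar - grad_g1(y)), sigma)
    step = np.linalg.norm(x_new - x)
    x = x_new

print(f"final primal update norm: {step:.2e}")
```

The paper's contribution lies in the analysis rather than in this generic template: PDPG and iDAPG are shown to converge linearly under assumptions weaker than those required by existing primal-dual schemes of this kind.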

Country of Origin
🇭🇰 Hong Kong

Page Count
17 pages

Category
Mathematics:
Optimization and Control