Anderson Accelerated Primal-Dual Hybrid Gradient for solving LP
By: Yingxin Zhou, Stefano Cipolla, Phan Tu Vuong
Potential Business Impact:
Solves large-scale linear programming problems much faster than the standard method.
We present the Anderson Accelerated Primal-Dual Hybrid Gradient (AA-PDHG), a fixed-point-based framework designed to overcome the slow convergence of the standard PDHG method for solving linear programming (LP) problems. We establish the global convergence of AA-PDHG under a safeguard condition. In addition, we propose a filtered variant (FAA-PDHG) that applies angle and length filtering to preserve the uniform boundedness of the coefficient matrix, a property crucial for guaranteeing convergence. Numerical results show that both AA-PDHG and FAA-PDHG deliver significant speedups over vanilla PDHG for large-scale LP instances.
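The abstract does not spell out the algorithm, but the general pattern is standard: PDHG is a fixed-point iteration z ↦ T(z) on the primal-dual pair, and Anderson acceleration extrapolates over a short history of iterates, with a safeguard falling back to a plain PDHG step when the accelerated candidate does not help. The sketch below illustrates that generic pattern on the LP min cᵀx s.t. Ax = b, x ≥ 0; all names, step sizes, and the residual-based safeguard rule are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def pdhg_step(z, A, b, c, tau, sigma):
    """One PDHG fixed-point step T(z) for min c'x s.t. Ax = b, x >= 0."""
    n = c.size
    x, y = z[:n], z[n:]
    # Projected primal step, then dual step with the usual extrapolation.
    x_new = np.maximum(x - tau * (c - A.T @ y), 0.0)
    y_new = y + sigma * (b - A @ (2 * x_new - x))
    return np.concatenate([x_new, y_new])

def aa_pdhg(A, b, c, m=5, iters=3000, tau=0.1, sigma=0.1):
    """Anderson-accelerated PDHG sketch with a simple residual safeguard.
    The safeguard here (accept the AA candidate only if its fixed-point
    residual does not grow) is a hypothetical stand-in for the paper's
    safeguard condition."""
    z = np.zeros(A.shape[1] + A.shape[0])
    Zs, Fs = [], []  # short histories of iterates and residuals
    for _ in range(iters):
        Tz = pdhg_step(z, A, b, c, tau, sigma)
        f = Tz - z  # fixed-point residual
        Zs.append(z); Fs.append(f)
        Zs, Fs = Zs[-(m + 1):], Fs[-(m + 1):]
        if len(Fs) > 1:
            # Least-squares mixing weights over residual differences.
            dF = np.stack([Fs[i + 1] - Fs[i] for i in range(len(Fs) - 1)], axis=1)
            dZ = np.stack([Zs[i + 1] - Zs[i] for i in range(len(Zs) - 1)], axis=1)
            gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)
            z_aa = Tz - (dZ + dF) @ gamma  # type-II Anderson candidate
            f_aa = pdhg_step(z_aa, A, b, c, tau, sigma) - z_aa
            z = z_aa if np.linalg.norm(f_aa) <= np.linalg.norm(f) else Tz
        else:
            z = Tz
    return z[:c.size]
```

On a tiny LP such as min x₁ + 2x₂ s.t. x₁ + x₂ = 1, x ≥ 0 (unique optimum (1, 0)), the accelerated iteration recovers a near-feasible, near-optimal point; the filtering in FAA-PDHG would additionally drop history columns that are nearly collinear or too long before solving the least-squares problem.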
Similar Papers
Dual Acceleration for Minimax Optimization: Linear Convergence Under Relaxed Assumptions
Optimization and Control
Solves hard math problems faster and more reliably.
An accelerated primal-dual gradient flow for linearly constrained multiobjective optimization
Optimization and Control
Solves many hard problems with many goals at once.
GPU-Accelerated Primal Heuristics for Mixed Integer Programming
Optimization and Control
Makes computers solve hard problems much faster.