Smoothness of the Augmented Lagrangian Dual in Convex Optimization
By: Jingwang Li, Vincent Lau
Potential Business Impact:
Makes constrained optimization problems easier to solve reliably.
This paper investigates the general linearly constrained optimization problem $\min_{x \in \mathbb{R}^d} f(x) \ \text{s.t.} \ A x = b$, where $f: \mathbb{R}^d \rightarrow (-\infty, +\infty]$ is a closed proper convex function, $A \in \mathbb{R}^{p \times d}$, and $b \in \mathbb{R}^p$. We establish the following results without requiring additional regularity conditions: (1) the augmented Lagrangian dual function $\phi_{\rho}(\lambda) = \inf_x \mathcal{L}_{\rho}(x, \lambda)$ is $\frac{1}{\rho}$-smooth everywhere; and (2) the solution to $\min_{x \in \mathbb{R}^d} \mathcal{L}_{\rho}(x, \lambda)$ exists for any dual variable $\lambda \in \mathbb{R}^p$, where $\rho > 0$ is the penalty parameter and $\mathcal{L}_{\rho}(x, \lambda) = f(x) + \langle \lambda, A x - b \rangle + \frac{\rho}{2}\|A x - b\|^2$ is the augmented Lagrangian. These findings significantly relax the strong assumptions commonly imposed in the existing literature to guarantee similar properties.
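The $\frac{1}{\rho}$-smoothness claim can be checked numerically on a toy instance. The sketch below uses the specific choice $f(x) = \frac{1}{2}\|x\|^2$ (an assumption made here for closed-form minimization; the paper covers general closed proper convex $f$): the inner minimizer $x^\star(\lambda)$ solves $(I + \rho A^\top A)x = A^\top(\rho b - \lambda)$, and the dual gradient is $\nabla \phi_{\rho}(\lambda) = A x^\star(\lambda) - b$, whose Lipschitz constant should not exceed $1/\rho$.

```python
import numpy as np

# Toy instance: f(x) = 0.5*||x||^2 (illustrative choice, not from the paper),
# random linear constraints A x = b, penalty parameter rho.
rng = np.random.default_rng(0)
d, p, rho = 5, 3, 2.0
A = rng.standard_normal((p, d))
b = rng.standard_normal(p)

def grad_phi(lam):
    # argmin_x L_rho(x, lam): first-order condition
    # x + A^T lam + rho * A^T (A x - b) = 0
    # => (I + rho * A^T A) x = A^T (rho * b - lam)
    x_star = np.linalg.solve(np.eye(d) + rho * A.T @ A, A.T @ (rho * b - lam))
    # Danskin-type identity: grad phi_rho(lam) = A x*(lam) - b
    return A @ x_star - b

# Check the (1/rho)-Lipschitz bound on the dual gradient at random points.
lam1, lam2 = rng.standard_normal(p), rng.standard_normal(p)
lhs = np.linalg.norm(grad_phi(lam1) - grad_phi(lam2))
rhs = np.linalg.norm(lam1 - lam2) / rho
print(lhs <= rhs + 1e-12)
```

For this quadratic $f$, the dual gradient map is linear with operator norm $\max_i \sigma_i^2/(1 + \rho \sigma_i^2) \le 1/\rho$ over the singular values $\sigma_i$ of $A$, so the printed check holds for every pair of dual points, consistent with the paper's general result.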
Similar Papers
Stochastic Smoothed Primal-Dual Algorithms for Nonconvex Optimization with Linear Inequality Constraints
Optimization and Control
Makes computer learning faster and more accurate.
Bregman Linearized Augmented Lagrangian Method for Nonconvex Constrained Stochastic Zeroth-order Optimization
Optimization and Control
Solves hard computer problems faster with less guessing.
Sharp bounds in perturbed smooth optimization
Optimization and Control
Makes computer math problems more predictable.