Linear convergence of a one-cut conditional gradient method for total variation regularization
By: Giacomo Cristinelli, José A. Iglesias, Daniel Walter
Potential Business Impact:
Speeds up methods that make computer images clearer with less data.
We introduce a fully-corrective generalized conditional gradient method for convex minimization problems involving total variation regularization on multidimensional domains. It alternates between updating an active set of subsets of the spatial domain and updating an iterate given by a conic combination of the associated characteristic functions. In contrast to previous approaches in the same spirit, the computation of a new candidate set only requires the solution of one prescribed mean curvature problem instead of a fractional minimization task analogous to finding a generalized Cheeger set. After discretization, the former can be realized by a single run of a graph cut algorithm, leading to a significant speedup in practice. We prove the global sublinear convergence of the resulting method under mild assumptions, and its asymptotic linear convergence in a more restrictive two-dimensional setting, using stability results for surfaces of prescribed curvature under perturbations of the curvature. Finally, we numerically demonstrate this convergence behavior on model PDE-constrained minimization problems, as illustrated by the sketch below.
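To make the alternating structure concrete, here is a minimal, hypothetical Python sketch of the outer loop, not the authors' code: the names one_cut_gcg and perimeter are illustrative, the problem is reduced to a 1D discrete signal, and the thresholding used for the insertion step is a crude stand-in for the single graph cut that, in the method itself, solves the prescribed mean curvature problem exactly on a grid.

import numpy as np
from scipy.optimize import minimize

def perimeter(E):
    # Toy 1D "perimeter": number of jumps of the indicator vector E.
    return float(np.sum(E[1:] != E[:-1]))

def one_cut_gcg(K, f, alpha, n_iter=20):
    # Sketch of a fully-corrective generalized conditional gradient
    # method for  min_u 0.5*||K u - f||^2 + alpha*TV(u),
    # with u built as a conic combination of indicator functions.
    m, n = K.shape
    sets, perims = [], []          # active sets E_k and their perimeters
    u = np.zeros(n)
    for it in range(n_iter):
        p = K.T @ (K @ u - f)      # gradient of the smooth data term
        # Insertion step: one prescribed mean curvature problem,
        #   min_E  sum_{i in E} p_i + alpha * Per(E).
        # A single max-flow/min-cut run solves this exactly after
        # discretization; this thresholding ignores the perimeter term.
        E = p < 0
        if not E.any():
            break                  # no descent set found: stop
        sets.append(E.astype(float))
        perims.append(perimeter(E))
        # Fully-corrective step: refit nonnegative weights c over all
        # active indicators; the perimeter penalty is linear in c.
        A = np.stack(sets, axis=1)          # columns are indicators 1_{E_k}
        P = np.asarray(perims)
        def obj(c):
            return 0.5 * np.sum((K @ (A @ c) - f) ** 2) + alpha * P @ c
        def grad(c):
            return A.T @ (K.T @ (K @ (A @ c) - f)) + alpha * P
        res = minimize(obj, np.zeros(len(sets)), jac=grad,
                       bounds=[(0, None)] * len(sets), method="L-BFGS-B")
        u = A @ res.x
    return u

if __name__ == "__main__":
    # Toy denoising example (K = identity) with a piecewise constant truth.
    rng = np.random.default_rng(0)
    K = np.eye(50)
    truth = np.zeros(50); truth[15:35] = 1.0
    f = truth + 0.1 * rng.standard_normal(50)
    u = one_cut_gcg(K, f, alpha=0.05)

The fully-corrective coefficient refit is what distinguishes this family of methods from a plain conditional gradient step: all previously inserted sets keep competing for weight, and sets whose coefficient drops to zero are effectively pruned.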
Similar Papers
Unrolling Nonconvex Graph Total Variation for Image Denoising
Image and Video Processing
Cleans up blurry pictures better than before.
Tight Convergence Rates in Gradient Mapping for the Difference-of-Convex Algorithm
Optimization and Control
Improves how computers solve tricky math problems.
A Graphical Global Optimization Framework for Parameter Estimation of Statistical Models with Nonconvex Regularization Functions
Optimization and Control
Solves hard math puzzles for computers faster.