Randomized subspace correction methods for convex optimization
By: Boou Jiang, Jongho Park, Jinchao Xu
Potential Business Impact:
Speeds up hard optimization problems by breaking them into smaller pieces that are solved in a random order.
This paper introduces an abstract framework for randomized subspace correction methods for convex optimization, which unifies and generalizes a broad class of existing algorithms, including domain decomposition, multigrid, and block coordinate descent methods. We provide a convergence rate analysis ranging from minimal assumptions to more practical settings, such as sharpness and strong convexity. While most existing studies on block coordinate descent methods focus on nonoverlapping decompositions and smooth or strongly convex problems, our framework extends to more general settings involving arbitrary space decompositions, inexact local solvers, and problems with limited smoothness or convexity. The proposed framework is broadly applicable to convex optimization problems arising in areas such as nonlinear partial differential equations, imaging, and data science.
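To make the abstract idea concrete, below is a minimal sketch, not the paper's exact scheme, of a randomized subspace correction iteration for a strongly convex quadratic. It uses an overlapping two-block coordinate decomposition and an inexact local solver (a few gradient steps per block); the matrix A, the blocks, and the step sizes are illustrative assumptions introduced here for the example.

# Minimal sketch (illustrative assumptions only): randomized subspace correction
# for F(x) = 0.5 x^T A x - b^T x with overlapping coordinate blocks and an
# inexact local solver given by a few gradient steps on the sampled block.
import numpy as np

rng = np.random.default_rng(0)

n = 8
M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)          # symmetric positive definite, so F is strongly convex
b = rng.standard_normal(n)

def grad(x):
    return A @ x - b

# Space decomposition: each "subspace" is spanned by a set of coordinates,
# and the two blocks overlap in coordinates 3 and 4.
blocks = [np.arange(0, 5), np.arange(3, 8)]

def local_correction(x, idx, inner_steps=5):
    """Inexact local solver: a few gradient steps restricted to the subspace."""
    x = x.copy()
    L_loc = np.linalg.norm(A[np.ix_(idx, idx)], 2)   # local Lipschitz constant
    for _ in range(inner_steps):
        x[idx] -= grad(x)[idx] / L_loc
    return x

x = np.zeros(n)
for k in range(500):
    idx = blocks[rng.integers(len(blocks))]          # sample a subspace at random
    x = local_correction(x, idx)

x_star = np.linalg.solve(A, b)
print("distance to minimizer:", np.linalg.norm(x - x_star))

With a nonoverlapping decomposition and exact local solves, this iteration reduces to standard randomized block coordinate descent, one of the special cases mentioned in the abstract.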
Similar Papers
Connections between convex optimization algorithms and subspace correction methods
Optimization and Control
Connects classical fast-solver ideas to modern optimization algorithms.
An accelerated randomized Bregman-Kaczmarz method for strongly convex linearly constraint optimization
Optimization and Control
Speeds up solving constrained optimization problems with an accelerated randomized method.
Randomized block Krylov method for approximation of truncated tensor SVD
Numerical Analysis
Shrinks huge multidimensional data sets so computers can process them.