Randomized subspace correction methods for convex optimization

Published: July 2, 2025 | arXiv ID: 2507.01415v1

By: Boou Jiang, Jongho Park, Jinchao Xu

Potential Business Impact:

Solves large convex optimization problems faster by breaking them into smaller subproblems that are solved in random order.

Business Areas:
A/B Testing, Data and Analytics

This paper introduces an abstract framework for randomized subspace correction methods for convex optimization, which unifies and generalizes a broad class of existing algorithms, including domain decomposition, multigrid, and block coordinate descent methods. We provide a convergence rate analysis ranging from minimal assumptions to more practical settings, such as sharpness and strong convexity. While most existing studies on block coordinate descent methods focus on nonoverlapping decompositions and smooth or strongly convex problems, our framework extends to more general settings involving arbitrary space decompositions, inexact local solvers, and problems with limited smoothness or convexity. The proposed framework is broadly applicable to convex optimization problems arising in areas such as nonlinear partial differential equations, imaging, and data science.
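To make the abstract concrete, the sketch below shows the simplest instance the paper subsumes: randomized block coordinate descent with a nonoverlapping decomposition and exact local solvers, applied to a quadratic model problem. This is an illustrative toy, not the paper's general framework (which also covers overlapping decompositions, inexact local solvers, and nonsmooth problems); the function and block structure here are assumptions for the example.

```python
import numpy as np

def randomized_block_cd(A, b, blocks, iters=200, rng=None):
    """Minimize f(x) = 0.5 x'Ax - b'x (A symmetric positive definite)
    by randomized subspace correction: at each step, pick one block of
    coordinates at random and minimize f exactly over that block with
    the remaining coordinates frozen."""
    rng = np.random.default_rng(rng)
    x = np.zeros(len(b))
    for _ in range(iters):
        blk = blocks[rng.integers(len(blocks))]
        A_bb = A[np.ix_(blk, blk)]
        # Right-hand side of the local problem: b_blk minus the
        # coupling with the frozen (off-block) coordinates.
        rhs = b[blk] - A[blk] @ x + A_bb @ x[blk]
        x[blk] = np.linalg.solve(A_bb, rhs)  # exact local solve
    return x
```

With two blocks this is a randomized version of block Gauss-Seidel; replacing the exact solve by a single gradient step on the block gives an inexact local solver, one of the relaxations the paper's framework covers.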

Country of Origin
πŸ‡ΈπŸ‡¦ Saudi Arabia

Page Count
21 pages

Category
Mathematics:
Optimization and Control