An accelerated randomized Bregman-Kaczmarz method for strongly convex linearly constrained optimization
By: Lionel Tondji, Dirk A. Lorenz, Ion Necoara
Potential Business Impact:
Helps computers solve constrained optimization problems much faster.
In this paper, we propose a randomized accelerated method for the minimization of a strongly convex function under linear constraints. The method is of Kaczmarz type, i.e., it uses only a single linear equation of the constraint system in each iteration. To obtain acceleration, we build on the fact that the Kaczmarz method is dual to a coordinate descent method: we take a recently proposed acceleration scheme for randomized coordinate descent and transfer it to the primal space. The resulting method inherits many of the attractive features of the accelerated coordinate descent method, including its worst-case convergence rates. A theoretical analysis of the convergence of the proposed method is given. Numerical experiments show that the proposed method is more efficient than existing methods for the same problem.
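To make the Kaczmarz-type structure concrete, the following is a minimal sketch of a basic (non-accelerated) randomized Bregman-Kaczmarz iteration for the special case f(x) = 0.5*||x||^2 + lam*||x||_1 subject to Ax = b, where the map back to the primal variable is soft thresholding. This is an illustrative assumption, not the accelerated method of the paper; the function names, row-sampling rule, and step size below are choices made for the example.

```python
import numpy as np

def soft_threshold(z, lam):
    # Gradient of the conjugate of f(x) = 0.5*||x||^2 + lam*||x||_1 (assumed objective)
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def randomized_bregman_kaczmarz(A, b, lam=0.1, iters=20000, seed=0):
    """Illustrative sketch of a non-accelerated randomized Bregman-Kaczmarz loop.

    Approximately solves  min_x 0.5*||x||^2 + lam*||x||_1  s.t.  Ax = b,
    touching a single row of A (a single linear equation) per iteration.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms = np.einsum("ij,ij->i", A, A)
    probs = row_norms / row_norms.sum()        # sample rows with probability ~ ||a_i||^2
    z = np.zeros(n)                            # auxiliary (dual-side) variable
    x = soft_threshold(z, lam)                 # primal iterate: x = grad f^*(z)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        residual = A[i] @ x - b[i]
        z -= (residual / row_norms[i]) * A[i]  # Kaczmarz-type correction using one row
        x = soft_threshold(z, lam)             # map back to the primal space
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 500))
    x_true = np.zeros(500)
    x_true[:10] = rng.standard_normal(10)
    b = A @ x_true
    x_hat = randomized_bregman_kaczmarz(A, b)
    print("constraint residual:", np.linalg.norm(A @ x_hat - b))
```

The acceleration studied in the paper would replace the plain update of z with a momentum-type scheme derived from accelerated randomized coordinate descent on the dual; the per-iteration cost of a single row access is preserved.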
Similar Papers
Bregman Linearized Augmented Lagrangian Method for Nonconvex Constrained Stochastic Zeroth-order Optimization
Optimization and Control
Solves hard computer problems faster with less guessing.
Randomized Kaczmarz Methods with Beyond-Krylov Convergence
Numerical Analysis
Solves hard math problems much faster.
A federated Kaczmarz algorithm
Numerical Analysis
Solves big math problems faster on many computers.