Mean-square and linear convergence of a stochastic proximal point algorithm in metric spaces of nonpositive curvature
By: Nicholas Pischke
Potential Business Impact:
Helps computers find answers faster in complex math.
We define a stochastic variant of the proximal point algorithm in the general setting of nonlinear (separable) Hadamard spaces for approximating zeros of the mean of a stochastically perturbed monotone vector field, and we prove its convergence under a suitable strong monotonicity assumption, together with a probabilistic independence assumption and a separability assumption on the tangent spaces. As a particular case, our results transfer, for the first time, previous work by P. Bianchi on this method from Hilbert spaces to Hadamard manifolds. Moreover, our convergence proof is fully effective and allows for the construction of explicit rates of convergence of the iteration towards the (unique) solution, both in mean and almost surely. These rates are highly uniform, being independent of most data surrounding the iteration, the space, and the distribution; in that generality, they are novel already in the context of Hilbert spaces. We further discuss linear nonasymptotic guarantees under additional second-moment conditions on the Yosida approximates, as well as special cases of stochastic convex minimization.
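To illustrate the kind of iteration the abstract describes, here is a minimal sketch in the simplest possible setting: the real line (a trivial Hadamard space) with the strongly monotone stochastic model f(x, ξ) = (x − ξ)²/2, whose exact proximal (resolvent) step has a closed form. The function names, the step-size choice λ_k = 1/(k+1), and the quadratic model are all illustrative assumptions, not the paper's general construction.

```python
import random

def stochastic_proximal_point(sample, x0=0.0, n_iters=100_000):
    """Stochastic proximal point iteration for f(x, xi) = (x - xi)^2 / 2.

    The exact proximal step
        x_{k+1} = argmin_x [ (x - xi_k)^2 / 2 + (x - x_k)^2 / (2 * lam_k) ]
    has the closed form x_{k+1} = (x_k + lam_k * xi_k) / (1 + lam_k).
    With diminishing steps lam_k = 1/(k+1), the iterates approach the zero
    of the mean vector field, i.e. E[xi] in this toy model.
    """
    x = x0
    for k in range(n_iters):
        lam = 1.0 / (k + 1)               # diminishing step size
        xi = sample()                     # stochastic perturbation
        x = (x + lam * xi) / (1.0 + lam)  # exact resolvent step
    return x

random.seed(0)
# Noisy samples centered at the true solution mu = 3.0.
x_star = stochastic_proximal_point(lambda: random.gauss(3.0, 1.0))
print(x_star)  # close to 3.0
```

In this one-dimensional quadratic case the recursion reduces to a running average of the samples, which makes the mean-square convergence to the unique zero transparent; the paper's contribution is establishing such guarantees, with explicit and uniform rates, in general separable Hadamard spaces.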
Similar Papers
Distributed Stochastic Proximal Algorithm on Riemannian Submanifolds for Weakly-convex Functions
Optimization and Control
Helps robots learn to work together better.
Accelerated stochastic first-order method for convex optimization under heavy-tailed noise
Optimization and Control
Makes computer learning faster with messy data.
Quantitative Convergence Analysis of Projected Stochastic Gradient Descent for Non-Convex Losses via the Goldstein Subdifferential
Optimization and Control
Makes AI learn faster without needing extra tricks.