Decentralized Online Riemannian Optimization Beyond Hadamard Manifolds
By: Emre Sahinoglu, Shahin Shahrampour
Potential Business Impact:
Makes smart machines learn better on curved paths.
We study decentralized online Riemannian optimization over manifolds with possibly positive curvature, going beyond the Hadamard manifold setting. Decentralized optimization techniques rely on a consensus step that is well understood in Euclidean spaces because of their linearity. However, in positively curved Riemannian spaces, a main technical challenge is that geodesic distances may not induce a globally convex structure. In this work, we first analyze a curvature-aware Riemannian consensus step that enables linear convergence beyond Hadamard manifolds. Building on this step, we establish an $O(\sqrt{T})$ regret bound for the decentralized online Riemannian gradient descent algorithm. Then, we investigate the two-point bandit feedback setup, where we employ computationally efficient gradient estimators using smoothing techniques, and we demonstrate the same $O(\sqrt{T})$ regret bound through a subconvexity analysis of the smoothed objectives.
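To make the setup concrete, here is a minimal sketch of what a decentralized online Riemannian gradient descent round could look like on the unit sphere, a manifold with positive curvature. The sphere, the quadratic toy loss, the uniform mixing matrix, the step sizes, and the finite-difference form of the two-point estimator are illustrative assumptions, not the paper's actual algorithm or constants; only the high-level structure (a tangent-space consensus step followed by a Riemannian gradient step with bandit feedback) mirrors the abstract.

```python
import numpy as np

def proj_tangent(x, u):
    """Project an ambient vector u onto the tangent space of the unit sphere at x."""
    return u - np.dot(x, u) * x

def exp_map(x, v):
    """Exponential map on the unit sphere: move from x along tangent vector v."""
    nrm = np.linalg.norm(v)
    if nrm < 1e-12:
        return x
    return np.cos(nrm) * x + np.sin(nrm) * (v / nrm)

def log_map(x, y):
    """Logarithm map on the unit sphere: tangent vector at x pointing toward y."""
    inner = np.clip(np.dot(x, y), -1.0, 1.0)
    theta = np.arccos(inner)
    if theta < 1e-12:
        return np.zeros_like(x)
    direction = proj_tangent(x, y - inner * x)
    return theta * direction / np.linalg.norm(direction)

def consensus_step(points, W, i):
    """One consensus step for agent i: average the neighbors' iterates
    in the tangent space at x_i, then map back with the exponential map."""
    x_i = points[i]
    direction = sum(W[i, j] * log_map(x_i, points[j]) for j in range(len(points)))
    return exp_map(x_i, direction)

def two_point_gradient(f, x, delta, dim):
    """Two-point bandit-feedback estimator of a smoothed objective's gradient,
    using a random unit tangent direction and two function evaluations."""
    u = proj_tangent(x, np.random.randn(dim))
    u /= np.linalg.norm(u)
    return (dim / (2.0 * delta)) * (f(exp_map(x, delta * u)) - f(exp_map(x, -delta * u))) * u

# Toy run (hypothetical parameters): n agents track a common point on the sphere.
rng = np.random.default_rng(0)
n, dim, T, eta, delta = 5, 3, 200, 0.1, 0.05
W = np.full((n, n), 1.0 / n)                      # doubly stochastic mixing matrix
target = np.array([0.0, 0.0, 1.0])
points = [x / np.linalg.norm(x) for x in rng.standard_normal((n, dim))]

for t in range(T):
    f_t = lambda x: 0.5 * np.linalg.norm(x - target) ** 2   # loss revealed at round t
    new_points = []
    for i in range(n):
        x_c = consensus_step(points, W, i)                   # Riemannian consensus
        g = two_point_gradient(f_t, x_c, delta, dim)         # bandit gradient estimate
        new_points.append(exp_map(x_c, -eta * g))            # Riemannian gradient step
    points = new_points

print("final losses:", [round(0.5 * np.linalg.norm(x - target) ** 2, 4) for x in points])
```

With exact gradient feedback, the `two_point_gradient` call would simply be replaced by projecting the Euclidean gradient of the loss onto the tangent space at `x_c`.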
Similar Papers
Online Optimization on Hadamard Manifolds: Curvature Independent Regret Bounds on Horospherically Convex Objectives
Machine Learning (CS)
Improves math for computers on curved surfaces.
Distributed Stochastic Proximal Algorithm on Riemannian Submanifolds for Weakly-convex Functions
Optimization and Control
Helps robots learn to work together better.
Riemannian Optimization for Distance Geometry: A Study of Convergence, Robustness, and Incoherence
Optimization and Control
Finds hidden shapes from incomplete distance clues.