Score: 1

Accelerated stochastic first-order method for convex optimization under heavy-tailed noise

Published: October 13, 2025 | arXiv ID: 2510.11676v1

By: Chuan He, Zhaosong Lu

Potential Business Impact:

Speeds up machine-learning training when gradient estimates are corrupted by heavy-tailed (outlier-prone) noise, without requiring extra safeguards such as gradient clipping.

Business Areas:
A/B Testing, Data and Analytics

We study convex composite optimization problems, where the objective function is given by the sum of a prox-friendly function and a convex function whose subgradients are estimated under heavy-tailed noise. Existing work often employs gradient clipping or normalization techniques in stochastic first-order methods to address heavy-tailed noise. In this paper, we demonstrate that a vanilla stochastic algorithm -- without additional modifications such as clipping or normalization -- can achieve optimal complexity for these problems. In particular, we establish that an accelerated stochastic proximal subgradient method achieves a first-order oracle complexity that is universally optimal for smooth, weakly smooth, and nonsmooth convex optimization, as well as for stochastic convex optimization under heavy-tailed noise. Numerical experiments are further provided to validate our theoretical results.
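The abstract names an accelerated stochastic proximal subgradient method but does not spell out the update rule here. Below is a minimal illustrative sketch of a generic accelerated proximal subgradient loop without clipping or normalization, applied to a toy LASSO-style problem with heavy-tailed (Student-t) gradient noise. The step sizes, momentum weights, and problem data are assumptions for illustration, not the paper's exact scheme or parameter choices.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (one example of a prox-friendly term).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def accel_stoch_prox_subgradient(sub_grad, prox, x0, steps, momentum, n_iters):
    """Generic accelerated stochastic proximal subgradient loop (illustrative sketch).

    sub_grad(x) -> noisy subgradient estimate (possibly heavy-tailed noise)
    prox(v, s)  -> proximal step on the prox-friendly part with step size s
    steps, momentum -> assumed step-size and extrapolation sequences
    """
    x_prev = x0.copy()
    x = x0.copy()
    for t in range(n_iters):
        # Nesterov-style extrapolation step.
        y = x + momentum[t] * (x - x_prev)
        g = sub_grad(y)                        # noisy estimate, used as-is (no clipping)
        x_prev = x
        x = prox(y - steps[t] * g, steps[t])   # vanilla proximal update
    return x

# Toy problem (assumed for illustration): min_x 0.5*||Ax - b||^2 + lam*||x||_1,
# with heavy-tailed Student-t noise added to each gradient estimate.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
lam = 0.1

def noisy_grad(x):
    return A.T @ (A @ x - b) + rng.standard_t(df=2.5, size=x.shape)

n_iters = 2000
steps = [1e-3 / np.sqrt(t + 1) for t in range(n_iters)]   # diminishing steps (assumed)
momentum = [t / (t + 3) for t in range(n_iters)]          # standard Nesterov-type weights

x_hat = accel_stoch_prox_subgradient(
    noisy_grad, lambda v, s: soft_threshold(v, lam * s),
    np.zeros(20), steps, momentum, n_iters)
print("recovered support:", np.flatnonzero(np.abs(x_hat) > 0.1))
```

The point of the sketch is structural: the noisy subgradient enters the update unmodified, which mirrors the paper's claim that a vanilla (unclipped, unnormalized) accelerated scheme can cope with heavy-tailed noise; the specific step-size and momentum sequences above are placeholders.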

Country of Origin
🇸🇪 🇺🇸 Sweden, United States

Page Count
21 pages

Category
Mathematics:
Optimization and Control