VFOG: Variance-Reduced Fast Optimistic Gradient Methods for a Class of Nonmonotone Generalized Equations

Published: August 22, 2025 | arXiv ID: 2508.16791v1

By: Quoc Tran-Dinh, Nghia Nguyen-Trung

Potential Business Impact:

Makes machine learning training faster and more accurate by speeding up the underlying optimization.

Business Areas:
A/B Testing, Data and Analytics

We develop a novel optimistic gradient-type algorithmic framework, combining Nesterov's acceleration and variance-reduction techniques, to solve a class of generalized equations involving possibly nonmonotone operators in data-driven applications. Our framework covers a wide class of stochastic variance-reduced schemes, including mini-batching and both unbiased and biased control-variate estimators. We establish that our method achieves $\mathcal{O}(1/k^2)$ convergence rates in expectation on the squared norm of the residual under Lipschitz continuity and a ``co-hypomonotonicity-type'' assumption, improving upon non-accelerated counterparts by a factor of $1/k$. We also prove faster $o(1/k^2)$ convergence rates, both in expectation and almost surely. In addition, we show that the sequence of iterates of our method almost surely converges to a solution of the underlying problem. We demonstrate the applicability of our method using general error bound criteria, covering mini-batch stochastic estimators as well as three well-known control-variate estimators: loopless SVRG, SAGA, and loopless SARAH, where the latter three variants attain significantly better oracle complexity than existing methods. We validate our framework and theoretical results through two numerical examples. The preliminary results illustrate promising performance of our accelerated method over its non-accelerated counterparts.
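As a rough illustration of the building blocks mentioned in the abstract (not the paper's VFOG method itself), the sketch below shows a plain, non-accelerated optimistic (past-extragradient) iteration driven by a loopless-SVRG control-variate estimate of an averaged operator $G(x) = \frac{1}{n}\sum_i G_i(x)$. The function names, step size `eta`, snapshot probability `p`, and mini-batch size are hypothetical choices for illustration; the Nesterov acceleration, parameter schedules, and convergence guarantees are developed in the paper and not reproduced here.

```python
import numpy as np

def loopless_svrg_estimator(G_i, x, snapshot, full_G_snapshot, batch_idx):
    # Control-variate (loopless SVRG) estimate of G(x):
    # mean over the mini-batch of G_i(x) - G_i(snapshot), plus the full G(snapshot).
    diff = np.mean([G_i(i, x) - G_i(i, snapshot) for i in batch_idx], axis=0)
    return diff + full_G_snapshot

def optimistic_gradient_vr(G_i, n, x0, eta=0.1, p=0.05, iters=200, batch=8, seed=0):
    """Non-accelerated optimistic (past-extragradient) iteration with a
    loopless-SVRG operator estimate; eta, p, batch are illustrative values."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    snapshot = x0.copy()
    full_G = np.mean([G_i(i, snapshot) for i in range(n)], axis=0)
    g_prev = full_G.copy()
    for _ in range(iters):
        idx = rng.choice(n, size=batch, replace=False)
        g = loopless_svrg_estimator(G_i, x, snapshot, full_G, idx)
        # Optimistic step: extrapolate using the previous operator estimate.
        x = x - eta * (2.0 * g - g_prev)
        g_prev = g
        if rng.random() < p:  # refresh the snapshot with probability p (loopless SVRG)
            snapshot = x.copy()
            full_G = np.mean([G_i(i, snapshot) for i in range(n)], axis=0)
    return x

if __name__ == "__main__":
    # Toy sanity check (hypothetical data): G_i(x) = A_i x - b_i with solution x* = 1.
    rng = np.random.default_rng(1)
    n, d = 50, 5
    A = [np.eye(d) + 0.1 * rng.standard_normal((d, d)) for _ in range(n)]
    b = [Ai @ np.ones(d) for Ai in A]
    G_i = lambda i, x: A[i] @ x - b[i]
    x_hat = optimistic_gradient_vr(G_i, n, np.zeros(d))
    print("residual norm:",
          np.linalg.norm(np.mean([G_i(i, x_hat) for i in range(n)], axis=0)))
```

The residual norm printed at the end corresponds to the squared-residual criterion used in the abstract's rates; the toy problem is only a plausibility check, not a reproduction of the paper's experiments.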

Country of Origin
🇺🇸 United States

Page Count
54 pages

Category
Mathematics:
Optimization and Control