Accelerated Distributed Optimization with Compression and Error Feedback

Published: March 11, 2025 | arXiv ID: 2503.08427v2

By: Yuan Gao, Anton Rodomanov, Jeremy Rack, and more

Potential Business Impact:

Speeds up distributed machine learning while transmitting less data between clients and the server.

Business Areas:
A/B Testing, Data and Analytics

Modern machine learning tasks often involve massive datasets and models, necessitating distributed optimization algorithms with reduced communication overhead. Communication compression, where clients transmit compressed updates to a central server, has emerged as a key technique to mitigate communication bottlenecks. However, the theoretical understanding of stochastic distributed optimization with contractive compression remains limited, particularly in conjunction with Nesterov acceleration -- a cornerstone for achieving faster convergence in optimization. In this paper, we propose a novel algorithm, ADEF (Accelerated Distributed Error Feedback), which integrates Nesterov acceleration, contractive compression, error feedback, and gradient difference compression. We prove that ADEF achieves the first accelerated convergence rate for stochastic distributed optimization with contractive compression in the general convex regime. Numerical experiments validate our theoretical findings and demonstrate the practical efficacy of ADEF in reducing communication costs while maintaining fast convergence.
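The abstract combines several ingredients: a contractive compressor (e.g. top-k), an error-feedback buffer that retains what compression discards, and acceleration. The sketch below is not the paper's ADEF algorithm; it is a minimal single-client illustration of the contractive-compression-plus-error-feedback mechanism on a hypothetical quadratic objective, with all names (`top_k`, `grad`, step size, problem sizes) chosen only for illustration.

```python
# Illustrative sketch, NOT the paper's ADEF method: plain gradient descent with
# a contractive top-k compressor and a classic error-feedback buffer.
import numpy as np


def top_k(x, k):
    """Contractive compressor: keep the k largest-magnitude entries, zero the rest."""
    out = np.zeros_like(x)
    idx = np.argpartition(np.abs(x), -k)[-k:]
    out[idx] = x[idx]
    return out


def grad(w, A, b):
    """Gradient of the quadratic f(w) = 0.5 * ||A w - b||^2 (hypothetical objective)."""
    return A.T @ (A @ w - b)


rng = np.random.default_rng(0)
d, n, k, lr = 50, 200, 5, 0.1
A = rng.standard_normal((n, d)) / np.sqrt(n)
b = rng.standard_normal(n)

w = np.zeros(d)
e = np.zeros(d)  # error-feedback buffer: accumulates what compression dropped
for t in range(500):
    g = grad(w, A, b)
    v = e + lr * g        # error-corrected update
    c = top_k(v, k)       # transmit only the compressed part
    e = v - c             # keep the residual for future rounds
    w = w - c             # server applies the compressed update

print("final loss:", 0.5 * np.linalg.norm(A @ w - b) ** 2)
```

With error feedback, the discarded coordinates are not lost: they stay in the buffer `e` and are re-injected in later rounds, which is what lets aggressive compressors remain convergent. ADEF additionally layers Nesterov acceleration and gradient difference compression on top of this idea, as described in the abstract.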

Page Count
33 pages

Category
Mathematics:
Optimization and Control