Distributed Stochastic Zeroth-Order Optimization with Compressed Communication

Published: March 21, 2025 | arXiv ID: 2503.17429v3

By: Youqing Hua, Shuai Liu, Yiguang Hong, and more

Potential Business Impact:

Lets distributed machines jointly optimize a shared objective without computing gradients or sharing raw data, while compressing the messages they exchange.

Business Areas:
A/B Testing Data and Analytics

The dual challenges of prohibitive communication overhead and the impracticality of gradient computation, due to data privacy or black-box constraints in distributed systems, motivate this work on communication-constrained gradient-free optimization. We propose a stochastic distributed zeroth-order algorithm (Com-DSZO) requiring only two function evaluations per iteration, integrated with general compression operators. Rigorous analysis establishes its sublinear convergence rate for both smooth and nonsmooth objectives, while explicitly elucidating the compression-convergence trade-off. Furthermore, we develop a variance-reduced variant (VR-Com-DSZO) under stochastic mini-batch feedback. The empirical performance of the algorithms is illustrated with numerical examples.
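To make the two ingredients concrete, here is a minimal single-node sketch of the building blocks the abstract describes: a two-point zeroth-order gradient estimate (exactly two function evaluations per iteration) combined with a compression operator before the "communicated" update. This is an illustrative assumption, not the paper's Com-DSZO algorithm: the distributed consensus structure is omitted, and top-k sparsification stands in for the paper's general compression operators. The function names, the test objective, and all parameter values are hypothetical choices for the sketch.

```python
import numpy as np

def two_point_zo_grad(f, x, mu=1e-4, rng=None):
    # Two-point zeroth-order estimate: only f(x + mu*u) and f(x - mu*u)
    # are evaluated; no gradient of f is ever computed.
    rng = np.random.default_rng() if rng is None else rng
    u = rng.standard_normal(x.shape)
    u /= np.linalg.norm(u)                     # random unit direction
    d = x.size
    return d * (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u

def top_k_compress(v, k):
    # Stand-in compression operator: keep the k largest-magnitude
    # entries, zero the rest (only k coordinates would be transmitted).
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

# Toy objective f(x) = ||x||^2 on a single node (hypothetical test case).
rng = np.random.default_rng(0)
f = lambda x: float(x @ x)
x = rng.standard_normal(20)
f_init = f(x)
for t in range(2000):
    g = two_point_zo_grad(f, x, rng=rng)
    g = top_k_compress(g, k=5)            # send only 5 of 20 coordinates
    x -= 0.05 / np.sqrt(t + 1) * g        # diminishing (sublinear) step size
f_final = f(x)
print(f_init, f_final)
```

Even with three quarters of each update zeroed out by the compression step, the diminishing-step loop still drives the toy objective down, which is the compression-convergence trade-off the abstract refers to in miniature.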

Page Count
10 pages

Category
Mathematics:
Optimization and Control