Distributed Stochastic Zeroth-Order Optimization with Compressed Communication
By: Youqing Hua, Shuai Liu, Yiguang Hong, and more
Potential Business Impact:
Helps computers learn without seeing all the data.
The dual challenges of prohibitive communication overhead and the impracticality of gradient computation, due to data privacy or black-box constraints in distributed systems, motivate this work on communication-constrained gradient-free optimization. We propose a stochastic distributed zeroth-order algorithm (Com-DSZO) that requires only two function evaluations per iteration and integrates with general compression operators. Rigorous analysis establishes its sublinear convergence rate for both smooth and nonsmooth objectives, while explicitly elucidating the compression-convergence trade-off. Furthermore, we develop a variance-reduced variant (VR-Com-DSZO) under stochastic mini-batch feedback. The empirical performance of both algorithms is illustrated with numerical examples.
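The abstract specifies the oracle model (two function evaluations per iteration) and the use of a general compression operator, but not Com-DSZO's exact update rule. The Python sketch below is therefore only a minimal illustration of those two ingredients, assuming a two-point central-difference estimator, a top-k compressor, and plain consensus mixing over a ring network; the names zo_grad, top_k, and W are hypothetical and not from the paper.

import numpy as np

rng = np.random.default_rng(0)

def zo_grad(f, x, mu=1e-4):
    # Two-point (central-difference) zeroth-order gradient estimator:
    # exactly two function evaluations per call, as in the abstract's oracle model.
    u = rng.standard_normal(x.shape)
    return (f(x + mu * u) - f(x - mu * u)) / (2.0 * mu) * u

def top_k(v, k):
    # One common compression operator (assumed here, not taken from the paper):
    # keep only the k largest-magnitude entries and zero the rest.
    out = np.zeros_like(v)
    keep = np.argsort(np.abs(v))[-k:]
    out[keep] = v[keep]
    return out

n, d = 4, 10                      # number of agents, problem dimension
A = [rng.standard_normal((d, d)) for _ in range(n)]
b = [rng.standard_normal(d) for _ in range(n)]
# Private smooth objectives f_i(x) = 0.5 x' A_i' A_i x + b_i' x, queried as black boxes.
fs = [lambda x, Ai=A[i], bi=b[i]: 0.5 * x @ (Ai.T @ Ai) @ x + bi @ x
      for i in range(n)]

# Doubly stochastic mixing matrix for a ring communication graph.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i + 1) % n] = 0.25
    W[i, (i - 1) % n] = 0.25

X = rng.standard_normal((n, d))   # one local iterate per agent
step = 1e-3
for _ in range(3000):
    # Each agent compresses its two-point gradient estimate before "transmitting" it.
    G = np.stack([top_k(zo_grad(fs[i], X[i]), k=3) for i in range(n)])
    X = W @ X - step * G          # consensus mixing + compressed zeroth-order step

avg = X.mean(axis=0)
print("average objective at consensus point:", np.mean([f(avg) for f in fs]))

Compressing the gradient messages (rather than the iterates) is just one of several places a compression operator can sit in such schemes; the paper's compression-convergence trade-off would govern how aggressive a choice of k the analysis tolerates.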
Similar Papers
Decentralized Optimization with Amplified Privacy via Efficient Communication
Systems and Control
Keeps secret messages safe while learning.
Communication-Efficient Distributed Online Nonconvex Optimization with Time-Varying Constraints
Optimization and Control
Helps robots learn tasks with less data.
Optimal Complexity in Byzantine-Robust Distributed Stochastic Optimization with Data Heterogeneity
Optimization and Control
Makes computers work better with bad data.