Communication-Efficient Distributed Asynchronous ADMM

Published: August 17, 2025 | arXiv ID: 2508.12233v1

By: Sagar Shrestha

Potential Business Impact:

Quantizes the data exchanged between machines to cut communication costs in large-scale machine learning.

In distributed optimization and federated learning, the asynchronous alternating direction method of multipliers (ADMM) is an attractive option for large-scale optimization, offering data privacy, tolerance to straggler nodes, and support for a variety of objective functions. However, communication costs can become a major bottleneck when nodes have limited communication budgets or when the data to be communicated is prohibitively large. In this work, we propose applying coarse quantization to the data exchanged in asynchronous ADMM so as to reduce communication overhead for large-scale federated learning and distributed optimization applications. We experimentally verify the convergence of the proposed method on several distributed learning tasks, including neural networks.
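To illustrate the idea, here is a minimal sketch of consensus ADMM for distributed least squares where each node coarsely quantizes its uplink message before the server averages it. This is not the paper's implementation; the unbiased stochastic uniform quantizer, the bit width, and the synchronous update schedule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(v, num_bits=8):
    """Unbiased stochastic uniform quantizer on [-scale, scale].

    Illustrative choice of quantizer, not necessarily the one in the paper.
    """
    scale = np.max(np.abs(v))
    if scale == 0.0:
        return v.copy()
    levels = 2 ** (num_bits - 1) - 1          # grid points per side
    scaled = v / scale * levels
    lower = np.floor(scaled)
    # Round up with probability equal to the fractional part (unbiased in expectation).
    q = lower + (rng.random(v.shape) < (scaled - lower))
    return q / levels * scale

def quantized_consensus_admm(A_list, b_list, rho=1.0, num_bits=8, iters=200):
    """Consensus ADMM for min sum_i ||A_i x - b_i||^2, with quantized uplink messages."""
    n = A_list[0].shape[1]
    z = np.zeros(n)                            # global (server) variable
    u = [np.zeros(n) for _ in A_list]          # scaled dual variables, one per node
    x = [np.zeros(n) for _ in A_list]          # local primal variables
    # Cache each node's local solve: (A_i^T A_i + rho I)^{-1}.
    inv_list = [np.linalg.inv(A.T @ A + rho * np.eye(n)) for A in A_list]
    for _ in range(iters):
        msgs = []
        for i, (A, b) in enumerate(zip(A_list, b_list)):
            # Local x-update: argmin ||A_i x - b_i||^2 + (rho/2)||x - z + u_i||^2.
            x[i] = inv_list[i] @ (A.T @ b + rho * (z - u[i]))
            # Coarse quantization of the message sent to the server.
            msgs.append(quantize(x[i] + u[i], num_bits))
        z = np.mean(msgs, axis=0)              # z-update: average of quantized messages
        for i in range(len(A_list)):
            u[i] = u[i] + x[i] - z             # standard scaled dual update
    return z
```

With an unbiased quantizer the averaged iterate hovers near the consensus solution, with residual error at the quantizer's resolution; the scheme here is synchronous for brevity, whereas the paper targets the asynchronous setting.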

Country of Origin
🇺🇸 United States

Page Count
10 pages

Category
Computer Science:
Machine Learning (CS)