Jointly Computation- and Communication-Efficient Distributed Learning

Published: August 21, 2025 | arXiv ID: 2508.15509v1

By: Xiaoxing Ren, Nicola Bastianello, Karl H. Johansson, and more

Potential Business Impact:

Lets networked computers train a shared model with less computation while exchanging less data.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

We address distributed learning problems over undirected networks. Specifically, we focus on designing a novel ADMM-based algorithm that is jointly computation- and communication-efficient. Our design guarantees computational efficiency by allowing agents to use stochastic gradients during local training. Moreover, communication efficiency is achieved as follows: i) the agents perform multiple training epochs between communication rounds, and ii) compressed transmissions are used. We prove exact linear convergence of the algorithm in the strongly convex setting. We corroborate our theoretical results by numerical comparisons with state-of-the-art techniques on a classification task.
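For intuition, here is a minimal Python sketch of the two mechanisms the abstract describes: each agent runs several local SGD epochs on an ADMM-augmented local objective, then broadcasts a top-k-compressed state update to its neighbors before a dual update. The least-squares data, ring graph, top-k operator, and all hyperparameters below are illustrative assumptions, not the paper's actual algorithm or parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic setup (assumption): n agents, each holding local least-squares data.
n_agents, dim, m_local = 5, 20, 40
A = [rng.standard_normal((m_local, dim)) for _ in range(n_agents)]
x_true = rng.standard_normal(dim)
b = [Ai @ x_true + 0.01 * rng.standard_normal(m_local) for Ai in A]

# Undirected ring graph: each agent talks to two neighbors.
neighbors = [[(i - 1) % n_agents, (i + 1) % n_agents] for i in range(n_agents)]

def top_k(v, k):
    """Keep the k largest-magnitude entries of v, zero out the rest."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

rho, lr, epochs, k = 1.0, 1e-3, 5, 4      # penalty, SGD step, local epochs, top-k budget
x = [np.zeros(dim) for _ in range(n_agents)]       # primal variables
p = [np.zeros(dim) for _ in range(n_agents)]       # dual variables
x_hat = [np.zeros(dim) for _ in range(n_agents)]   # last compressed state each agent broadcast

for rnd in range(200):
    # i) Multiple local SGD epochs on the ADMM-augmented local objective.
    for i in range(n_agents):
        targets = [(x_hat[i] + x_hat[j]) / 2 for j in neighbors[i]]
        for _ in range(epochs):
            for s in rng.permutation(m_local):     # one pass over local data = one epoch
                g = A[i][s] * (A[i][s] @ x[i] - b[i][s])          # stochastic gradient of f_i
                g = g + p[i] + rho * sum(x[i] - t for t in targets)  # dual + penalty terms
                x[i] = x[i] - lr * g
    # ii) Compressed transmissions: broadcast a top-k update of the local state.
    for i in range(n_agents):
        x_hat[i] = x_hat[i] + top_k(x[i] - x_hat[i], k)
    # Dual ascent on the consensus constraints, using the received compressed states.
    for i in range(n_agents):
        p[i] = p[i] + rho * sum(x_hat[i] - x_hat[j] for j in neighbors[i])

err = np.mean([np.linalg.norm(xi - x_true) for xi in x])
print(f"mean distance to x_true after {rnd + 1} rounds: {err:.3f}")
```

This toy loop only illustrates the communication pattern the abstract names (several local epochs between rounds, compressed neighbor exchanges); the paper's exact linear convergence guarantee applies to the authors' algorithm in the strongly convex setting, not to this sketch.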

Country of Origin
πŸ‡ΈπŸ‡ͺ Sweden

Page Count
8 pages

Category
Computer Science:
Machine Learning (CS)