Jointly Computation- and Communication-Efficient Distributed Learning
By: Xiaoxing Ren, Nicola Bastianello, Karl H. Johansson, et al.
Potential Business Impact:
Lets networked computers learn together with less local computation and less data sent between them.
We address distributed learning problems over undirected networks. Specifically, we focus on designing a novel ADMM-based algorithm that is jointly computation- and communication-efficient. Our design guarantees computational efficiency by allowing agents to use stochastic gradients during local training. Moreover, communication efficiency is achieved as follows: i) the agents perform multiple training epochs between communication rounds, and ii) compressed transmissions are used. We prove exact linear convergence of the algorithm in the strongly convex setting. We corroborate our theoretical results with numerical comparisons to state-of-the-art techniques on a classification task.
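To make the pattern concrete, below is a minimal simulation sketch of the generic recipe the abstract describes (local stochastic-gradient epochs between communication rounds, plus compressed per-edge transmissions inside an edge-based relaxed ADMM loop). It is not the authors' exact algorithm: the ridge-regression losses, ring network, top-k compressor, and all parameter values (rho, lr, epochs, alpha) are illustrative assumptions, and the correction mechanism the paper uses to retain exact convergence under compression is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# ---- hypothetical setup: ridge regression (strongly convex) on a 4-agent ring ----
n_agents, dim, n_samples = 4, 5, 20
A = [rng.standard_normal((n_samples, dim)) for _ in range(n_agents)]
b = [Ai @ rng.standard_normal(dim) + 0.1 * rng.standard_normal(n_samples) for Ai in A]
reg = 0.1
neighbors = {i: [(i - 1) % n_agents, (i + 1) % n_agents] for i in range(n_agents)}

def stoch_grad(i, x):
    """Stochastic gradient of agent i's ridge loss from one random sample."""
    j = rng.integers(n_samples)
    return A[i][j] * (A[i][j] @ x - b[i][j]) + reg * x

def top_k(v, k=2):
    """Top-k sparsification: one common compressor, chosen here for illustration."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

rho, lr, epochs, rounds, alpha = 1.0, 0.05, 5, 200, 0.5
x = [np.zeros(dim) for _ in range(n_agents)]  # local primal variables
z = {(i, j): np.zeros(dim) for i in range(n_agents) for j in neighbors[i]}  # edge duals

for _ in range(rounds):
    # (i) local training: several cheap stochastic-gradient epochs, no communication
    for i in range(n_agents):
        for _ in range(epochs):
            g = stoch_grad(i, x[i])
            g += sum(rho * x[i] - z[(i, j)] for j in neighbors[i])  # ADMM penalty term
            x[i] -= lr * g
    # (ii) one communication round: each agent receives a *compressed* message per edge
    new_z = {}
    for i in range(n_agents):
        for j in neighbors[i]:
            msg = top_k(2 * rho * x[j] - z[(j, i)])  # what agent j transmits to agent i
            new_z[(i, j)] = (1 - alpha) * z[(i, j)] + alpha * msg
    z = new_z

print("consensus gap:", max(np.linalg.norm(x[i] - x[0]) for i in range(n_agents)))
```

The split mirrors the two efficiency claims: computation is kept cheap because only sampled gradients are evaluated during the multiple local epochs, and communication is kept cheap because each edge carries a single compressed vector per round. Note that this bare sketch does not reproduce the exact linear convergence the paper proves; naive compression alone generally leaves a residual error.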
Similar Papers
Communication-Efficient Distributed Asynchronous ADMM
Machine Learning (CS)
Shrinks data to speed up computer learning.
Learning to accelerate distributed ADMM using graph neural networks
Machine Learning (CS)
Learns faster ways for computers to solve big problems.
Modular Distributed Nonconvex Learning with Error Feedback
Optimization and Control
Makes computers learn faster with less data.