CoCoL: A Communication Efficient Decentralized Collaborative Method for Multi-Robot Systems
By: Jiaxi Huang, Yan Huang, Yixian Zhao, and more
Potential Business Impact:
Robots learn together while talking less.
Collaborative learning enhances the performance and adaptability of multi-robot systems in complex tasks, but it faces significant challenges from the high communication overhead and data heterogeneity inherent in multi-robot tasks. To this end, we propose CoCoL, a Communication-efficient decentralized Collaborative Learning method tailored for multi-robot systems with heterogeneous local datasets. Leveraging a mirror descent framework, CoCoL achieves remarkable communication efficiency through approximate Newton-type updates that capture the similarity between robots' objective functions, and it reduces computational cost via inexact sub-problem solutions. The integration of a gradient tracking scheme further ensures robustness against data heterogeneity. Experimental results on three representative multi-robot collaborative learning tasks show the superiority of CoCoL in significantly reducing both the number of communication rounds and total bandwidth consumption while maintaining state-of-the-art accuracy. These benefits are particularly evident in challenging scenarios involving non-IID (non-independent and identically distributed) data distributions, streaming data, and time-varying network topologies.
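The abstract names mirror descent, approximate Newton-type updates, and gradient tracking as CoCoL's building blocks, but the paper's exact update rules are not reproduced here. As a rough illustration of the gradient-tracking component alone, the sketch below shows a generic synchronous gradient-tracking round over a doubly stochastic mixing matrix; the function names, the ring topology, the quadratic toy losses, and the step size `eta` are all illustrative assumptions, not CoCoL's actual implementation.

```python
# Minimal sketch of decentralized gradient tracking (illustrative only; CoCoL
# couples a mirror-descent / approximate-Newton sub-problem with this idea,
# which is not reproduced here).
import numpy as np

def gradient_tracking_step(x, y, grads_prev, grad_fn, W, eta=0.1):
    """One synchronous round for n robots communicating over mixing matrix W.

    x          : (n, d) local model parameters
    y          : (n, d) gradient trackers (initialized to the local gradients)
    grads_prev : (n, d) local gradients at the current iterate x
    grad_fn    : grad_fn(i, w) -> (d,) gradient of robot i's local loss at w
    W          : (n, n) doubly stochastic mixing matrix (network topology)
    """
    # Consensus on parameters, then descend along the tracked global direction.
    x_next = W @ x - eta * y
    # Re-evaluate local gradients and update the tracker so the average of y
    # stays equal to the average gradient across all robots.
    grads_next = np.stack([grad_fn(i, x_next[i]) for i in range(x.shape[0])])
    y_next = W @ y + grads_next - grads_prev
    return x_next, y_next, grads_next

# Toy usage: 4 robots on a ring, each with a heterogeneous quadratic loss
# 0.5 * ||w - t_i||^2, so the global optimum is the mean of the targets.
n, d = 4, 3
targets = np.random.randn(n, d)
grad_fn = lambda i, w: w - targets[i]
W = 0.5 * np.eye(n) + 0.25 * (np.roll(np.eye(n), 1, 0) + np.roll(np.eye(n), -1, 0))

x = np.zeros((n, d))
grads = np.stack([grad_fn(i, x[i]) for i in range(n)])
y = grads.copy()
for _ in range(200):
    x, y, grads = gradient_tracking_step(x, y, grads, grad_fn, W)
print(np.allclose(x, targets.mean(axis=0), atol=1e-3))  # all robots reach consensus near the optimum
```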
Similar Papers
Collaborative-Online-Learning-Enabled Distributionally Robust Motion Control for Multi-Robot Systems
Optimization and Control
Robots learn to avoid bumping into things.
Cross-region Model Training with Communication-Computation Overlapping and Delay Compensation
Distributed, Parallel, and Cluster Computing
Makes AI learn faster across far-away computers.
LangCoop: Collaborative Driving with Language
Robotics
Cars talk to each other using simple words.