Demo: A Practical Testbed for Decentralized Federated Learning on Physical Edge Devices
By: Chao Feng, Nicolas Huber, Alberto Huertas Celdrán, and more
Potential Business Impact:
Trains computers together without sharing private info.
Federated Learning (FL) enables collaborative model training without sharing raw data, preserving participant privacy. Decentralized FL (DFL) removes the reliance on a central server, mitigating the single point of failure inherent in the traditional FL paradigm, but it introduces deployment challenges on resource-constrained devices. To evaluate real-world applicability, this work designs and deploys a physical testbed built from edge devices such as the Raspberry Pi and Jetson Nano. The testbed is built on the DFL training platform NEBULA and extends it with a power-monitoring module that measures energy consumption during training. Experiments across multiple datasets show that model performance depends on the communication topology, with denser topologies yielding better outcomes in DFL settings.
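To make the topology finding concrete, below is a minimal Python sketch of one DFL round under gossip-style neighbor averaging. The helpers (ring_topology, fully_connected_topology, local_train, dfl_round) are illustrative names assumed for this sketch, not NEBULA's actual API; local_train is a stand-in for real local SGD.

import numpy as np

def ring_topology(n):
    """Sparse: each node exchanges models with its two ring neighbors."""
    return {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}

def fully_connected_topology(n):
    """Dense: each node exchanges models with every other node."""
    return {i: [j for j in range(n) if j != i] for i in range(n)}

def local_train(weights, rng):
    """Stand-in for a local training step: nudge weights toward node-specific data."""
    return weights - 0.1 * (weights - rng.normal(size=weights.shape))

def dfl_round(weights, topology, rng):
    """One DFL round: local training, then average with direct neighbors only."""
    trained = [local_train(w, rng) for w in weights]
    return [
        np.mean([trained[i]] + [trained[j] for j in topology[i]], axis=0)
        for i in range(len(trained))
    ]

n, dim = 8, 4
init = np.random.default_rng(0).normal(size=(n, dim))
for topo_name, topo in [("ring", ring_topology(n)), ("full", fully_connected_topology(n))]:
    rng = np.random.default_rng(1)  # fresh stream so both topologies see the same noise
    ws = [w.copy() for w in init]
    for _ in range(20):
        ws = dfl_round(ws, topo, rng)
    spread = np.mean([np.linalg.norm(w - np.mean(ws, axis=0)) for w in ws])
    print(f"{topo_name}: model disagreement after 20 rounds = {spread:.4f}")

Running this shows the fully connected (denser) topology driving the nodes' models toward consensus much faster than the ring, which mirrors the paper's observation that denser topologies lead to better outcomes.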
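The abstract's power-monitoring module is not described in detail, so the following is only a plausible sketch of how per-round energy could be measured on a Jetson-style board: sample instantaneous power from a sysfs file and integrate it over time. The sysfs path is an assumed INA3221 location that varies by board and JetPack version, and EnergyMonitor is a hypothetical class, not NEBULA's module.

import time
import threading

# Assumed sensor path; on other boards or JetPack versions it will differ.
POWER_FILE = "/sys/bus/i2c/drivers/ina3221x/6-0040/iio:device0/in_power0_input"

class EnergyMonitor:
    """Samples power (mW) at a fixed interval and integrates it into joules."""

    def __init__(self, power_file=POWER_FILE, interval_s=0.5):
        self.power_file = power_file
        self.interval_s = interval_s
        self.energy_j = 0.0
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._run, daemon=True)

    def _read_power_w(self):
        with open(self.power_file) as f:
            return int(f.read().strip()) / 1000.0  # mW -> W

    def _run(self):
        last = time.monotonic()
        while not self._stop.is_set():
            time.sleep(self.interval_s)
            now = time.monotonic()
            # Energy = power x elapsed time (rectangle-rule integration).
            self.energy_j += self._read_power_w() * (now - last)
            last = now

    def __enter__(self):
        self._thread.start()
        return self

    def __exit__(self, *exc):
        self._stop.set()
        self._thread.join()

# Usage sketch: wrap a training round to attribute its energy cost.
# with EnergyMonitor() as mon:
#     train_one_round()
# print(f"Round consumed {mon.energy_j:.1f} J")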
Similar Papers
Performance Analysis of Decentralized Federated Learning Deployments
Machine Learning (CS)
Helps phones learn together without a boss.
DFPL: Decentralized Federated Prototype Learning Across Heterogeneous Data Distributions
Distributed, Parallel, and Cluster Computing
Helps computers learn together without sharing private data.
Optimizing Federated Learning for Scalable Power-demand Forecasting in Microgrids
Distributed, Parallel, and Cluster Computing
Learns energy use without sharing private data.