Score: 2

Lightweight Federated Learning over Wireless Edge Networks

Published: July 13, 2025 | arXiv ID: 2507.09546v1

By: Xiangwang Hou, Jingjing Wang, Jun Du, and more

Potential Business Impact:

Lets smart devices learn from their data collaboratively without sending private data off-device.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

With the exponential growth of smart devices connected to wireless networks, data production is increasing rapidly, requiring machine learning (ML) techniques to unlock its value. However, the centralized ML paradigm raises concerns over communication overhead and privacy. Federated learning (FL) offers an alternative at the network edge, but practical deployment in wireless networks remains challenging. This paper proposes a lightweight FL (LTFL) framework integrating wireless transmission power control, model pruning, and gradient quantization. We derive a closed-form expression of the FL convergence gap, considering transmission error, model pruning error, and gradient quantization error. Based on these insights, we formulate an optimization problem to minimize the convergence gap while meeting delay and energy constraints. To solve the non-convex problem efficiently, we derive closed-form solutions for the optimal model pruning ratio and gradient quantization level, and employ Bayesian optimization for transmission power control. Extensive experiments on real-world datasets show that LTFL outperforms state-of-the-art schemes.
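To make the two compression steps in the abstract concrete, below is a minimal sketch of magnitude-based model pruning and uniform stochastic gradient quantization, the kinds of operations an LTFL edge device would apply to its local update before uplink transmission. The function names, the magnitude-based pruning criterion, and the QSGD-style quantizer are illustrative assumptions; the paper's exact formulations and its closed-form optimal pruning ratio and quantization level are not reproduced here.

import numpy as np

def prune_model(weights: np.ndarray, pruning_ratio: float) -> np.ndarray:
    """Zero out the smallest-magnitude fraction `pruning_ratio` of the weights (assumed criterion)."""
    if pruning_ratio <= 0.0:
        return weights.copy()
    flat = np.abs(weights).ravel()
    k = int(np.floor(pruning_ratio * flat.size))
    if k == 0:
        return weights.copy()
    # Threshold below which weights are pruned away.
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

def quantize_gradient(grad: np.ndarray, num_levels: int) -> np.ndarray:
    """Uniform stochastic quantization of a gradient to `num_levels` levels per sign (assumed design)."""
    norm = np.linalg.norm(grad)
    if norm == 0.0:
        return grad.copy()
    # Scale magnitudes into [0, num_levels] and round stochastically so the quantizer is unbiased.
    scaled = np.abs(grad) / norm * num_levels
    lower = np.floor(scaled)
    prob = scaled - lower
    levels = lower + (np.random.rand(*grad.shape) < prob)
    return np.sign(grad) * levels * norm / num_levels

# Hypothetical usage: an edge device compresses its local update before transmission.
local_update = np.random.randn(1000)
sparse_update = prune_model(local_update, pruning_ratio=0.5)
quantized_update = quantize_gradient(sparse_update, num_levels=8)

In the paper's framework, the pruning ratio and quantization level used above are not fixed by hand; they are chosen per round from the derived closed-form solutions, while transmission power is tuned via Bayesian optimization, so that the convergence gap is minimized under the delay and energy budgets.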

Country of Origin
πŸ‡¨πŸ‡³ πŸ‡ΈπŸ‡¬ China, Singapore

Page Count
16 pages

Category
Computer Science: Distributed, Parallel, and Cluster Computing