Communication-Learning Co-Design for Differentially Private Over-the-Air Federated Distillation
By: Zihao Hu, Jia Yan, Ying-Jun Angela Zhang
Potential Business Impact:
Lets phones learn together without sharing private data.
The ever-growing size of learning models challenges the communication efficiency and privacy preservation of traditional federated learning (FL). In this paper, we propose a novel differentially private (DP) over-the-air federated distillation (FD) framework, in which wireless devices (WDs) periodically share noise-perturbed model outputs with the parameter server by harnessing the superposition property of multi-access channels. Over-the-air FD thus allows the WDs to share the responsibility of DP preservation on the low-dimensional disclosed signals. We study the communication-learning co-design problem in differentially private over-the-air FD, aiming to maximize the learning convergence rate while meeting the transmit power and DP requirements of the WDs. The main challenge is rooted in the intractable learning and privacy analysis of over-the-air FD, together with the strong coupling among decision variables spanning two timescales. To tackle this problem, we first derive the analytical learning convergence rate and the privacy losses of the WDs, based on which the optimal per-round transceiver design and the long-term decision on the number of training rounds are obtained in closed form. Numerical results demonstrate that the proposed differentially private over-the-air FD approach achieves a better learning-privacy trade-off with greatly reduced communication overhead compared with conventional FL benchmarks.
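To make the mechanism concrete, the following is a minimal sketch of one DP over-the-air FD round under simplifying assumptions: unit channel gains, a common transmit scaling factor, and a Gaussian DP perturbation at each device. All variable names and parameter values (K, D, alpha, sigma_dp, sigma_ch) are illustrative placeholders, not quantities from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Illustrative parameters (hypothetical, not taken from the paper) ---
K = 10           # number of wireless devices (WDs)
D = 16           # dimension of the disclosed signal (e.g., averaged logits)
alpha = 0.5      # common transmit scaling chosen to satisfy the power budget
sigma_dp = 0.1   # std of each WD's Gaussian DP perturbation
sigma_ch = 0.05  # std of the additive receiver noise at the server

# Each WD's local model output on a shared distillation set (placeholder data).
local_outputs = rng.normal(size=(K, D))

# Step 1: every WD adds Gaussian noise to its low-dimensional output for DP,
# then applies the common scaling so its transmit power stays within budget.
tx = alpha * (local_outputs + rng.normal(scale=sigma_dp, size=(K, D)))

# Step 2: analog over-the-air aggregation -- the multi-access channel sums the
# simultaneously transmitted signals, so the server observes only the
# superposition plus channel noise, never any individual WD's signal.
received = tx.sum(axis=0) + rng.normal(scale=sigma_ch, size=D)

# Step 3: the server rescales and averages to estimate the global (teacher)
# output broadcast back to the WDs for the next distillation round.
global_estimate = received / (alpha * K)
```

Because the channel itself performs the summation, each WD's individual noise contributes to the aggregate perturbation, which is what lets the devices split the DP noise burden rather than each adding the full amount.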
Similar Papers
Differential Privacy as a Perk: Federated Learning over Multiple-Access Fading Channels with a Multi-Antenna Base Station
Machine Learning (CS)
Keeps private data safe while learning.
Communication-Aware Knowledge Distillation for Federated LLM Fine-Tuning over Wireless Networks
Machine Learning (CS)
Lets AI learn together without sharing secrets.