Personalized Federated Learning with Bidirectional Communication Compression via One-Bit Random Sketching
By: Jiacheng Cheng, Xu Zhang, Guanghui Qiu, and more
Potential Business Impact:
Makes phones learn together without sharing private data.
Federated Learning (FL) enables collaborative training across decentralized data, but it faces two key challenges: bidirectional communication overhead and client-side data heterogeneity. In personalized FL, the goal shifts from training a single global model to creating a tailored model for each client. To address communication costs while embracing data heterogeneity, we propose pFed1BS, a novel personalized federated learning framework that achieves extreme communication compression through one-bit random sketching. In our framework, clients transmit highly compressed one-bit sketches, and the server aggregates them and broadcasts a global one-bit consensus. To enable effective personalization, we introduce a sign-based regularizer that guides local models to align with the global consensus while preserving local data characteristics. To mitigate the computational burden of random sketching, we employ the Fast Hadamard Transform for efficient projection. Theoretical analysis guarantees that our algorithm converges to a neighborhood of a stationary point of the global potential function. Numerical simulations demonstrate that pFed1BS substantially reduces communication costs while achieving competitive performance compared with advanced communication-efficient FL algorithms.
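The pipeline the abstract describes (a subsampled Hadamard projection shared via a common random seed, client-side one-bit quantization, and server-side aggregation into a one-bit consensus) can be made concrete with a small sketch. The Python/NumPy snippet below is a minimal illustration under our own assumptions: the function names, the orthonormal scaling, the tie-breaking rule, and the padding scheme are ours, not the paper's, and the actual pFed1BS algorithm may differ in details.

# Minimal sketch of one-bit random sketching with a Fast Walsh-Hadamard
# Transform. Illustrative only; details are assumptions, not the paper's spec.
import numpy as np

def fwht(x: np.ndarray) -> np.ndarray:
    """Fast Walsh-Hadamard Transform in O(d log d); len(x) must be a power of 2."""
    x = x.copy()
    d, h = len(x), 1
    while h < d:
        for i in range(0, d, 2 * h):
            a = x[i:i + h].copy()
            b = x[i + h:i + 2 * h].copy()
            x[i:i + h] = a + b
            x[i + h:i + 2 * h] = a - b
        h *= 2
    return x / np.sqrt(d)  # orthonormal scaling (our convention)

def one_bit_sketch(params: np.ndarray, k: int, seed: int) -> np.ndarray:
    """Client side: one-bit sketch sign(P H D w), with P, D drawn from a seed
    shared by all clients and the server so everyone uses the same operator."""
    rng = np.random.default_rng(seed)
    d = 1 << int(np.ceil(np.log2(len(params))))   # pad length to a power of 2
    x = np.zeros(d)
    x[:len(params)] = params
    diag = rng.choice([-1.0, 1.0], size=d)        # random sign diagonal D
    rows = rng.choice(d, size=k, replace=False)   # row subsampling P
    return np.sign(fwht(diag * x)[rows])          # k entries, 1 bit each

def server_consensus(sketches: list[np.ndarray]) -> np.ndarray:
    """Server side: aggregate client sketches into a global one-bit consensus."""
    return np.sign(np.sum(sketches, axis=0) + 1e-12)  # break ties toward +1

# Toy round: 3 clients with 1000 parameters each, compressed to k=64 bits.
seed, k = 0, 64
clients = [np.random.default_rng(i).normal(size=1000) for i in range(3)]
consensus = server_consensus([one_bit_sketch(w, k, seed) for w in clients])
print(consensus[:8])  # broadcast back to clients: k bits in total

For the sign-based regularizer, one plausible differentiable surrogate (again an assumption rather than the paper's exact form) penalizes local models whose sketch disagrees with the broadcast consensus, e.g. R(w) = -(lambda / k) * c^T (S w), where S = P H D is the shared sketching operator and c is the one-bit consensus, so each client trades off its local loss against alignment with the global consensus while keeping its own data characteristics.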
Similar Papers
pFedDSH: Enabling Knowledge Transfer in Personalized Federated Learning through Data-free Sub-Hypernetwork
Machine Learning (CS)
Helps AI learn from new users without seeing their private info.
FedBiF: Communication-Efficient Federated Learning via Bits Freezing
Machine Learning (CS)
Lets computers learn together while sending less data.
Lightweight Federated Learning in Mobile Edge Computing with Statistical and Device Heterogeneity Awareness
Systems and Control
Makes phones learn together without sharing private data.