Score: 2

GC-Fed: Gradient Centralized Federated Learning with Partial Client Participation

Published: March 17, 2025 | arXiv ID: 2503.13180v2

By: Jungwon Seo, Ferhat Ozgur Catak, Chunming Rong, and more

Potential Business Impact:

Keeps collaborative AI training stable and accurate even when participants hold very different data, without pooling that data centrally.

Business Areas:
Content Delivery Network, Content and Publishing

Federated Learning (FL) enables privacy-preserving multi-source information fusion (MSIF) but is challenged by client drift in highly heterogeneous data settings. Many existing drift-mitigation strategies rely on reference-based techniques, such as gradient adjustments or proximal loss, that use historical snapshots (e.g., past gradients or previous global models) as reference points. When only a subset of clients participates in each training round, these historical references may not accurately capture the overall data distribution, leading to unstable training. In contrast, our proposed Gradient Centralized Federated Learning (GC-Fed) employs a hyperplane as a historically independent reference point to guide local training and enhance inter-client alignment. GC-Fed comprises two complementary components: Local GC, which centralizes gradients during local training, and Global GC, which centralizes updates during server aggregation. In our hybrid design, Local GC is applied to feature-extraction layers to harmonize client contributions, while Global GC refines classifier layers to stabilize round-wise performance. Theoretical analysis and extensive experiments on benchmark FL tasks demonstrate that GC-Fed effectively mitigates client drift and achieves up to a 20% improvement in accuracy under heterogeneous and partial participation conditions.
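To make the two components concrete, here is a minimal PyTorch sketch. It assumes the standard gradient-centralization operation (subtracting the per-output-slice mean of a multi-dimensional gradient); the helper names (`centralize`, `local_gc_step`, `global_gc_aggregate`) and the explicit feature/classifier parameter-name split are illustrative, not taken from the authors' released code.

```python
import torch


def centralize(tensor: torch.Tensor) -> torch.Tensor:
    """Gradient Centralization: subtract the per-output-slice mean from a
    multi-dimensional tensor so each slice sums to zero. 1-D tensors
    (biases, norm parameters) are returned unchanged."""
    if tensor.dim() > 1:
        dims = tuple(range(1, tensor.dim()))
        return tensor - tensor.mean(dim=dims, keepdim=True)
    return tensor


def local_gc_step(model, optimizer, loss, feature_names):
    """Local GC (sketch): centralize gradients of the feature-extraction
    layers before each local optimizer step."""
    optimizer.zero_grad()
    loss.backward()
    for name, p in model.named_parameters():
        if p.grad is not None and name in feature_names:
            p.grad = centralize(p.grad)
    optimizer.step()


def global_gc_aggregate(client_deltas, classifier_names):
    """Global GC (sketch): average client updates FedAvg-style, then
    centralize the aggregated update for the classifier layers."""
    agg = {}
    for name in client_deltas[0]:
        delta = torch.stack([d[name] for d in client_deltas]).mean(dim=0)
        agg[name] = centralize(delta) if name in classifier_names else delta
    return agg
```

Per the hybrid design described in the abstract, `feature_names` would cover the feature-extraction layers (handled by Local GC on each client) and `classifier_names` the classifier layers (handled by Global GC at the server).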

Country of Origin
🇳🇴 🇰🇷 Norway, Republic of Korea

Repos / Data Links

Page Count
21 pages

Category
Computer Science:
Machine Learning (CS)