Hybrid Federated Learning for Noise-Robust Training

Published: January 8, 2026 | arXiv ID: 2601.04483v1

By: Yongjun Kim, Hyeongjun Park, Hwanjin Kim, and more

Potential Business Impact:

Helps phones learn together without sharing private info.

Business Areas:
E-Learning Education, Software

Federated learning (FL) and federated distillation (FD) are distributed learning paradigms that train user equipment (UE) models with enhanced privacy, each offering a different trade-off between noise robustness and learning speed. To mitigate their respective weaknesses, we propose a hybrid federated learning (HFL) framework in which each UE transmits either gradients or logits, and the base station (BS) selects the per-round weights of the FL and FD updates. We derive the convergence of the HFL framework and introduce two methods to exploit its degrees of freedom (DoF): (i) adaptive UE clustering via Jenks optimization and (ii) adaptive weight selection via a damped Newton method. Numerical results show that HFL achieves superior test accuracy at low SNR when both DoF are exploited.
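
As a rough illustration of the two DoF, the Python sketch below clusters UEs into an FL group and an FD group with a two-class Jenks natural breaks split over per-UE SNRs, then tunes a scalar FL/FD mixing weight with a damped Newton iteration. The SNR-based split criterion, the toy quadratic surrogate objective, and every name in the code are illustrative assumptions based only on this abstract, not the paper's exact formulation.

import numpy as np

def jenks_two_class_split(values):
    # Jenks natural breaks for k = 2 classes: scan every break point and
    # keep the one minimizing the total within-class sum of squared
    # deviations (the 1-D goodness-of-variance-fit criterion).
    v = np.sort(np.asarray(values, dtype=float))
    best_break, best_cost = 1, np.inf
    for b in range(1, len(v)):
        lo, hi = v[:b], v[b:]
        cost = ((lo - lo.mean()) ** 2).sum() + ((hi - hi.mean()) ** 2).sum()
        if cost < best_cost:
            best_break, best_cost = b, cost
    return v[best_break - 1]  # threshold: UEs at or below it form one class

def damped_newton(grad, hess, x0, damping=0.5, iters=20, tol=1e-8):
    # Damped Newton iteration x <- x - damping * g(x)/h(x) on a scalar
    # objective, with the mixing weight clipped to [0, 1].
    x = x0
    for _ in range(iters):
        g, h = grad(x), hess(x)
        if abs(h) < 1e-12:
            break
        step = damping * g / h
        x = min(max(x - step, 0.0), 1.0)
        if abs(step) < tol:
            break
    return x

# Hypothetical round: 8 UEs with measured SNRs (dB). Low-SNR UEs are
# assigned to the FD (logit) group, high-SNR UEs to the FL (gradient)
# group -- an assumed policy, not necessarily the paper's rule.
snrs = np.array([2.1, 3.0, 2.7, 11.5, 12.2, 2.4, 10.8, 13.0])
thresh = jenks_two_class_split(snrs)
fd_group = np.where(snrs <= thresh)[0]
fl_group = np.where(snrs > thresh)[0]

# Toy quadratic surrogate for the round's loss as a function of the
# FL/FD mixing weight alpha; its gradient and Hessian are closed-form.
alpha = damped_newton(grad=lambda a: 2.0 * (a - 0.7),
                      hess=lambda a: 2.0, x0=0.5)
print(f"threshold={thresh:.1f} dB, FD UEs={fd_group}, FL UEs={fl_group}, alpha={alpha:.3f}")

The two-class scan is the simplest exact form of Jenks breaks; in practice the damping factor keeps the Newton step stable when the round's loss surrogate is poorly conditioned, which matters at low SNR.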

Country of Origin
🇰🇷 Korea, Republic of

Page Count
5 pages

Category
Computer Science:
Machine Learning (CS)