Robust Federated Fine-Tuning in Heterogeneous Networks with Unreliable Connections: An Aggregation View
By: Yanmeng Wang, Zhiwen Dai, Shuai Wang, and more
Potential Business Impact:
Fixes AI learning when internet is bad.
Federated Fine-Tuning (FFT) has attracted growing interest because it leverages both server- and client-side data to enhance global model generalization while preserving privacy, and it significantly reduces the computational burden on edge devices by avoiding training from scratch. Despite these advantages, FFT performance is often degraded by unreliable server-client connections and heterogeneous client data distributions. Most existing methods assume homogeneous network conditions or require prior knowledge of connection failures. However, these assumptions are impractical in real-world networks characterized by diverse communication standards (e.g., wired, Wi-Fi, 4G, and 5G) and heterogeneous failure patterns. To address these limitations, we propose FedAuto, a novel FFT framework that mitigates the combined effects of connection failures and data heterogeneity via adaptive aggregation. FedAuto operates without prior knowledge of network conditions or modifications to existing infrastructure, enabling seamless plug-and-play deployment. Moreover, we establish a rigorous convergence guarantee for FedAuto. By adopting a novel per-round aggregation perspective, our analysis removes the need for assumptions on connection failure probabilities or client selection strategies commonly imposed in prior work, and it guarantees convergence of FedAuto for each individual realization, providing a stronger theoretical assurance. Extensive experiments demonstrate that FedAuto consistently outperforms state-of-the-art baselines under diverse connection failure scenarios for both full-parameter and partial-parameter fine-tuning (e.g., LoRA), and even surpasses strategies that rely on complex communication resource optimization.
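The core idea of per-round adaptive aggregation under unreliable connections can be sketched as follows. This is a minimal illustration, not FedAuto's actual algorithm: the dropout model, the `drop_prob` parameter, and the data-size weighting are all simplifying assumptions made here. The key point it demonstrates is that the server renormalizes aggregation weights over the clients it actually hears from each round, so it never needs prior knowledge of failure probabilities.

```python
# Illustrative sketch only: per-round aggregation over the clients whose
# uploads actually arrive. FedAuto's real aggregation rule is defined in
# the paper; drop_prob and data-size weighting are assumptions for demo.
import random


def aggregate(updates, weights):
    """Weighted average of the client updates that were received."""
    total = sum(weights)
    dim = len(updates[0])
    return [sum(w * u[i] for u, w in zip(updates, weights)) / total
            for i in range(dim)]


def run_round(global_model, clients, drop_prob=0.3, rng=random):
    """One fine-tuning round: each client's upload may fail in transit.

    clients: list of (data_size, local_update) pairs.
    Weights are renormalized over the received subset, so no prior
    knowledge of per-client connection failure rates is required.
    """
    received, weights = [], []
    for data_size, local_update in clients:
        if rng.random() < drop_prob:   # upload lost (Wi-Fi, 4G, ...)
            continue
        received.append(local_update)
        weights.append(data_size)
    if not received:                   # no upload survived this round
        return global_model
    delta = aggregate(received, weights)
    return [g + d for g, d in zip(global_model, delta)]
```

Because the normalization constant is recomputed from the realized set of arrivals each round, the same code runs unchanged whether failures are rare, bursty, or client-specific.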
Similar Papers
FedHFT: Efficient Federated Finetuning with Heterogeneous Edge Clients
Machine Learning (CS)
Helps AI learn from private data securely.
H2Tune: Federated Foundation Model Fine-Tuning with Hybrid Heterogeneity
Machine Learning (CS)
Helps AI learn from different computers better.
FTTE: Federated Learning on Resource-Constrained Devices
Machine Learning (CS)
Trains AI faster on small devices.