Breaking the Aggregation Bottleneck in Federated Recommendation: A Personalized Model Merging Approach
By: Jundong Chen, Honglei Zhang, Chunxu Zhang, and others
Potential Business Impact:
Keeps movie suggestions personal when learning together.
Federated recommendation (FR) facilitates collaborative training by aggregating local models from a massive number of devices, enabling client-specific personalization while ensuring privacy. However, we empirically and theoretically demonstrate that server-side aggregation can undermine client-side personalization, leading to suboptimal performance — an issue we term the aggregation bottleneck. It stems from the inherent heterogeneity across the numerous clients in FR, which drives the globally aggregated model to deviate from local optima. To address this, we propose FedEM, which elastically merges the global and local models to compensate for impaired personalization. Unlike existing personalized federated recommendation (pFR) methods, FedEM (1) investigates the aggregation bottleneck in FR through theoretical insights rather than heuristic analysis, and (2) leverages off-the-shelf local models rather than designing additional mechanisms to boost personalization. Extensive experiments on real-world datasets demonstrate that our method preserves client personalization during collaborative training, outperforming state-of-the-art baselines.
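The core idea — elastically merging the globally aggregated model with each client's local model — can be sketched as a per-client parameter interpolation. This is a minimal illustration, not FedEM's actual merging rule (the paper's coefficient and schedule are not specified here); `merge_models` and the coefficient `alpha` are assumed names.

```python
import numpy as np

def merge_models(global_params, local_params, alpha):
    """Interpolate between global and local model parameters.

    alpha in [0, 1] is a per-client merging coefficient:
    alpha = 0 keeps the server-aggregated global model,
    alpha = 1 keeps the purely local model.
    (Illustrative sketch only, not the paper's exact rule.)
    """
    return {
        name: (1.0 - alpha) * global_params[name] + alpha * local_params[name]
        for name in global_params
    }

# Toy example: a single item-embedding matrix per "model".
global_params = {"item_emb": np.zeros((4, 2))}
local_params = {"item_emb": np.ones((4, 2))}

# A client that weights its local model at 25% after aggregation.
merged = merge_models(global_params, local_params, alpha=0.25)
```

In this view, the merging coefficient controls how much each client pulls the aggregated model back toward its own local optimum, compensating for the drift that the abstract identifies as the aggregation bottleneck.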
Similar Papers
Beyond Aggregation: Guiding Clients in Heterogeneous Federated Learning
Machine Learning (CS)
Directs patients to the best hospital for them.
Local Performance vs. Out-of-Distribution Generalization: An Empirical Analysis of Personalized Federated Learning in Heterogeneous Data Environments
Machine Learning (CS)
Helps AI learn better from different data.
Multimodal-enhanced Federated Recommendation: A Group-wise Fusion Approach
Information Retrieval
Shares user tastes privately for better movie picks.