Federated Domain Generalization with Latent Space Inversion

Published: December 11, 2025 | arXiv ID: 2512.10224v1

By: Ragja Palakkadavath, Hung Le, Thanh Nguyen-Tang, and more

Potential Business Impact:

Keeps each client's private data safe while still learning from many distributed data sources.

Business Areas:
Darknet Internet Services

Federated domain generalization (FedDG) addresses distribution shifts among clients in a federated learning framework. FedDG methods aggregate the parameters of locally trained client models into a global model that generalizes to unseen clients while preserving data privacy. However, many existing FedDG approaches improve the global model's generalization at the cost of privacy, since clients share statistics of their data with one another. Our solution addresses this problem with new ways to perform both local client training and model aggregation. To improve local client training, we enforce (domain) invariance across local models using a novel technique, latent space inversion, which better preserves client privacy. When client data are not i.i.d., aggregating the local models may discard useful local adaptations. To overcome this, we propose an important-weight aggregation strategy that prioritizes the parameters with the greatest influence on each local model's predictions. Our extensive experiments show that our approach achieves superior results over state-of-the-art methods with less communication overhead.
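The abstract sketches two components: latent space inversion for local training and an important-weight aggregation rule that favors parameters with strong influence on local predictions. Since the abstract gives no implementation details, the sketch below is a minimal, hypothetical rendering of importance-weighted aggregation in PyTorch: it assumes importance is estimated with a squared-gradient (Fisher-style) proxy, and every function name here is an illustrative assumption, not the authors' API.

```python
import torch
import torch.nn as nn

def fisher_importance(model: nn.Module, loss: torch.Tensor) -> dict:
    """Hypothetical importance proxy: squared gradient magnitude per
    parameter (Fisher-style), computed on a client's local loss. The
    paper's actual estimator is not described in the abstract."""
    model.zero_grad()
    loss.backward()
    return {
        name: param.grad.detach().pow(2) + 1e-12  # epsilon keeps scores positive
        for name, param in model.named_parameters()
        if param.grad is not None
    }

def importance_weighted_aggregate(client_states, client_importances):
    """Aggregate client parameters elementwise, weighting each parameter
    by its normalized importance across clients (an assumed reading of
    the paper's important-weight strategy)."""
    global_state = {}
    for name in client_states[0]:
        params = torch.stack([state[name].float() for state in client_states])
        scores = torch.stack([imp[name].float() for imp in client_importances])
        scores = scores / scores.sum(dim=0)  # normalize across clients
        global_state[name] = (scores * params).sum(dim=0)
    return global_state

# Toy usage: three clients sharing one architecture, each with local data.
clients = [nn.Linear(4, 2) for _ in range(3)]
states, importances = [], []
for net in clients:
    x, y = torch.randn(8, 4), torch.randint(0, 2, (8,))
    loss = nn.functional.cross_entropy(net(x), y)
    importances.append(fisher_importance(net, loss))
    states.append({k: v.detach() for k, v in net.state_dict().items()})

global_state = importance_weighted_aggregate(states, importances)
```

With uniform importance scores this reduces to plain federated averaging; the elementwise weighting is what lets parameters that matter strongly to a particular client survive aggregation under non-i.i.d. data.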

Country of Origin
🇦🇺 🇺🇸 Australia, United States

Page Count
10 pages

Category
Computer Science:
Machine Learning (CS)