Score: 1

FedSDAF: Leveraging Source Domain Awareness for Enhanced Federated Domain Generalization

Published: May 5, 2025 | arXiv ID: 2505.02515v3

By: Hongze Li, Zesheng Zhou, Zhenbiao Cao and more

Potential Business Impact:

Enables models trained collaboratively across isolated data sources to generalize more reliably to new, unseen domains without sharing raw data.

Business Areas:
Semantic Search, Internet Services

Traditional Federated Domain Generalization (FedDG) methods focus on learning domain-invariant features or adapting to unseen target domains, often overlooking the unique knowledge embedded within the source domain, especially in strictly isolated federated learning environments. Through experimentation, we discovered a counterintuitive phenomenon: features learned from a complete source domain have superior generalization capabilities compared to those learned directly from the target domain. This insight leads us to propose the Federated Source Domain Awareness Framework (FedSDAF), the first systematic approach to enhance FedDG by leveraging source domain-aware features. FedSDAF employs a dual-adapter architecture that decouples "local expertise" from "global generalization consensus". A Domain-Aware Adapter, retained locally, extracts and protects the unique discriminative knowledge of each source domain, while a Domain-Invariant Adapter, shared across clients, builds a robust global consensus. To enable knowledge exchange, we introduce a Bidirectional Knowledge Distillation mechanism that facilitates efficient dialogue between the adapters. Extensive experiments on four benchmark datasets (OfficeHome, PACS, VLCS, DomainNet) show that FedSDAF significantly outperforms existing FedDG methods. The source code is available at https://github.com/pizzareapers/FedSDAF.
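The sketch below illustrates the dual-adapter idea in minimal PyTorch. It is not the authors' implementation (see the linked repository): the bottleneck adapter shape, the shared classifier head, the frozen-backbone assumption, and the symmetric temperature-scaled KL loss are assumptions made purely for illustration of how a locally kept domain-aware adapter and a federated domain-invariant adapter could exchange knowledge through bidirectional distillation.

# Illustrative sketch only, not the authors' released code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Adapter(nn.Module):
    """Bottleneck adapter applied on top of (assumed frozen) backbone features."""
    def __init__(self, dim, bottleneck=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, bottleneck), nn.ReLU(),
                                 nn.Linear(bottleneck, dim))
    def forward(self, x):
        return x + self.net(x)  # residual adaptation

class DualAdapterClient(nn.Module):
    """Domain-aware adapter stays on the client; the domain-invariant adapter
    (and, in this sketch, the classifier) would be aggregated by the server."""
    def __init__(self, feat_dim=512, num_classes=65):
        super().__init__()
        self.domain_aware = Adapter(feat_dim)      # kept local, never shared
        self.domain_invariant = Adapter(feat_dim)  # shared across clients
        self.classifier = nn.Linear(feat_dim, num_classes)
    def forward(self, feats):
        logits_aware = self.classifier(self.domain_aware(feats))
        logits_inv = self.classifier(self.domain_invariant(feats))
        return logits_aware, logits_inv

def bidirectional_kd_loss(logits_a, logits_b, T=2.0):
    """Symmetric KL divergence between the two adapter heads (hypothetical form)."""
    p_a = F.softmax(logits_a / T, dim=1)
    p_b = F.softmax(logits_b / T, dim=1)
    kd_ab = F.kl_div(F.log_softmax(logits_a / T, dim=1), p_b, reduction="batchmean")
    kd_ba = F.kl_div(F.log_softmax(logits_b / T, dim=1), p_a, reduction="batchmean")
    return (kd_ab + kd_ba) * (T * T)

# One local training step on a client, using stand-in backbone features.
model = DualAdapterClient()
feats = torch.randn(8, 512)
labels = torch.randint(0, 65, (8,))
logits_aware, logits_inv = model(feats)
loss = (F.cross_entropy(logits_aware, labels)
        + F.cross_entropy(logits_inv, labels)
        + bidirectional_kd_loss(logits_aware, logits_inv))
loss.backward()

In a federated round, only the domain-invariant adapter (and any shared head) would be sent to the server for averaging, while each client's domain-aware adapter remains private, matching the paper's stated decoupling of local expertise from global consensus.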

Country of Origin
🇨🇳 China

Repos / Data Links
https://github.com/pizzareapers/FedSDAF

Page Count
19 pages

Category
Computer Science:
Machine Learning (CS)