Score: 2

Global Variational Inference Enhanced Robust Domain Adaptation

Published: July 4, 2025 | arXiv ID: 2507.03291v1

By: Lingkun Luo, Shiqiang Hu, Liming Chen

Potential Business Impact:

Helps AI models trained on one data source perform reliably on different but related data sources.

Business Areas:
A/B Testing Data and Analytics

Deep learning-based domain adaptation (DA) methods have shown strong performance by learning transferable representations. However, their reliance on mini-batch training limits global distribution modeling, leading to unstable alignment and suboptimal generalization. We propose Global Variational Inference Enhanced Domain Adaptation (GVI-DA), a framework that learns continuous, class-conditional global priors via variational inference to enable structure-aware cross-domain alignment. GVI-DA minimizes domain gaps through latent feature reconstruction, and mitigates posterior collapse using global codebook learning with randomized sampling. It further improves robustness by discarding low-confidence pseudo-labels and generating reliable target-domain samples. Extensive experiments on four benchmarks and thirty-eight DA tasks demonstrate consistent state-of-the-art performance. We also derive the model's evidence lower bound (ELBO) and analyze the effects of prior continuity, codebook size, and pseudo-label noise tolerance. In addition, we compare GVI-DA with diffusion-based generative frameworks in terms of optimization principles and efficiency, highlighting both its theoretical soundness and practical advantages.
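To make the abstract's variational machinery concrete, here is a minimal, hypothetical sketch (not the authors' implementation) of an ELBO with a class-conditional prior whose mean is drawn at random from a global codebook, loosely mirroring the "global codebook learning with randomized sampling" idea; the identity decoder, dimensions, and codebook layout are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def kl_diag_gaussians(mu_q, logvar_q, mu_p, logvar_p):
    """KL( N(mu_q, var_q) || N(mu_p, var_p) ) for diagonal Gaussians, summed over dims."""
    var_q, var_p = np.exp(logvar_q), np.exp(logvar_p)
    return 0.5 * np.sum(logvar_p - logvar_q + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0)

def elbo(x, mu_q, logvar_q, codebook, label):
    """Toy ELBO with a class-conditional prior mean sampled from a global codebook.

    codebook[label] holds several candidate prior means for that class; one is
    picked at random per step (the 'randomized sampling' of the abstract),
    which discourages the posterior from collapsing onto a single prior mode.
    """
    # Reparameterized sample z ~ q(z|x)
    z = mu_q + np.exp(0.5 * logvar_q) * rng.standard_normal(mu_q.shape)
    # Toy decoder: identity map; Gaussian log-likelihood of x given z
    recon_loglik = -0.5 * np.sum((x - z) ** 2)
    # Randomly chosen class-conditional prior mean from the global codebook
    mu_p = codebook[label][rng.integers(len(codebook[label]))]
    kl = kl_diag_gaussians(mu_q, logvar_q, mu_p, np.zeros_like(mu_p))
    return recon_loglik - kl

# Illustrative usage: 2 classes, 3 codebook entries each, 4-dim latent space
d = 4
codebook = {0: rng.standard_normal((3, d)), 1: rng.standard_normal((3, d))}
x = rng.standard_normal(d)
print(elbo(x, mu_q=x.copy(), logvar_q=np.zeros(d), codebook=codebook, label=0))
```

The actual GVI-DA objective is derived in the paper via its ELBO; this toy only shows the structural role of the codebook prior in the KL term.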

Country of Origin
🇨🇳 🇫🇷 China, France

Page Count
18 pages

Category
Computer Science:
Machine Learning (CS)