Score: 1

Class-invariant Test-Time Augmentation for Domain Generalization

Published: September 17, 2025 | arXiv ID: 2509.14420v1

By: Zhicheng Lin, Xiaolin Wu, Xi Zhang

Potential Business Impact:

Improves image classifier accuracy on new, unseen types of images without retraining or costly test-time adaptation.

Business Areas:
A/B Testing Data and Analytics

Deep models often suffer significant performance degradation under distribution shifts. Domain generalization (DG) seeks to mitigate this challenge by enabling models to generalize to unseen domains. Most prior approaches rely on multi-domain training or computationally intensive test-time adaptation. In contrast, we propose a complementary strategy: lightweight test-time augmentation. Specifically, we develop a novel Class-Invariant Test-Time Augmentation (CI-TTA) technique. The idea is to generate multiple variants of each input image through elastic and grid deformations that nevertheless belong to the same class as the original input. Their predictions are aggregated through a confidence-guided filtering scheme that removes unreliable outputs, ensuring the final decision relies on consistent and trustworthy cues. Extensive experiments on the PACS and Office-Home datasets demonstrate consistent gains across different DG algorithms and backbones, highlighting the effectiveness and generality of our approach.
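A minimal sketch of the confidence-filtered test-time augmentation described in the abstract, assuming a pretrained classifier `model` and a single image tensor. The variant count, elastic-deformation strength, and confidence threshold are illustrative assumptions, not the paper's settings, and only elastic (not grid) deformations are shown.

```python
import torch
import torch.nn.functional as F
from torchvision.transforms import ElasticTransform

def ci_tta_predict(model, image, n_variants=8, conf_threshold=0.6):
    """Aggregate predictions over mildly deformed copies of `image`,
    keeping only variants the model classifies with high confidence."""
    model.eval()
    elastic = ElasticTransform(alpha=30.0)      # mild, label-preserving warp (assumed strength)
    variants = [image] + [elastic(image) for _ in range(n_variants)]
    batch = torch.stack(variants)               # (n_variants + 1, C, H, W)

    with torch.no_grad():
        probs = F.softmax(model(batch), dim=1)  # per-variant class probabilities

    # Confidence-guided filtering: drop variants whose top probability is low.
    conf, _ = probs.max(dim=1)
    keep = conf >= conf_threshold
    if not keep.any():                          # fall back to the original image
        keep = torch.zeros_like(conf, dtype=torch.bool)
        keep[0] = True

    # Average the surviving distributions and return the final class index.
    return probs[keep].mean(dim=0).argmax().item()
```

In this sketch the surviving predictions are averaged; the paper's exact aggregation and filtering rules may differ.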

Repos / Data Links

Page Count
5 pages

Category
Computer Science:
CV and Pattern Recognition