Neural Collapse in Test-Time Adaptation
By: Xiao Chen, Zhongjing Du, Jiazhen Huang, and more
Potential Business Impact:
Fixes AI mistakes when data changes.
Test-Time Adaptation (TTA) enhances model robustness to out-of-distribution (OOD) data by updating the model online during inference, yet existing methods lack theoretical insight into the fundamental causes of performance degradation under domain shifts. Recently, Neural Collapse (NC) has been proposed as an emergent geometric property of deep neural networks (DNNs), providing valuable insights for TTA. In this work, we extend NC to the sample-wise level and discover a novel phenomenon, termed Sample-wise Alignment Collapse (NC3+), demonstrating that a sample's feature embedding, obtained by a trained model, aligns closely with the corresponding classifier weight. Building on NC3+, we identify that the performance degradation stems from sample-wise misalignment during adaptation, which worsens under larger distribution shifts. This indicates the necessity of realigning feature embeddings with their corresponding classifier weights. However, the misalignment makes pseudo-labels unreliable under domain shifts. To address this challenge, we propose NCTTA, a novel feature-classifier alignment method whose hybrid targets blend geometric proximity with predictive confidence to mitigate the impact of unreliable pseudo-labels. Extensive experiments demonstrate the effectiveness of NCTTA in enhancing robustness to domain shifts; for example, NCTTA outperforms Tent by 14.52% on ImageNet-C.
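The core idea of blending geometric proximity with predictive confidence can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the function names, the blending weight `alpha`, and the use of a simple convex combination of two softmax distributions are all assumptions made for clarity.

```python
import numpy as np

def softmax(z):
    # Numerically stable row-wise softmax.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def hybrid_alignment_targets(features, classifier_w, alpha=0.5):
    """Build soft targets that blend geometric proximity (cosine
    similarity between features and classifier weights) with
    predictive confidence (softmax over raw logits).
    `alpha` is a hypothetical blending knob, not from the paper."""
    # Geometric proximity: cosine similarity of each feature to
    # each class weight vector, turned into a distribution.
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = classifier_w / np.linalg.norm(classifier_w, axis=1, keepdims=True)
    proximity = softmax(f @ w.T)
    # Predictive confidence: softmax over unnormalized logits.
    confidence = softmax(features @ classifier_w.T)
    return alpha * proximity + (1 - alpha) * confidence

def alignment_loss(features, classifier_w, targets):
    """Cross-entropy between the model's predictions and the hybrid
    targets, pulling each sample's feature embedding back toward its
    (softly assigned) classifier weight."""
    probs = softmax(features @ classifier_w.T)
    return -np.mean(np.sum(targets * np.log(probs + 1e-12), axis=1))
```

In use, the hybrid targets replace hard pseudo-labels in the adaptation objective: because each target is a convex combination of two probability distributions, it remains a valid distribution, and low-confidence or geometrically ambiguous samples contribute softer, less damaging gradients than a hard (and possibly wrong) pseudo-label would.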
Similar Papers
Open-World Test-Time Adaptation with Hierarchical Feature Aggregation and Attention Affine
CV and Pattern Recognition
Helps AI tell real from fake, even when surprised.
Instance-Aware Test-Time Segmentation for Continual Domain Shifts
CV and Pattern Recognition
Helps AI see better as things change.
Backpropagation-Free Test-Time Adaptation via Probabilistic Gaussian Alignment
CV and Pattern Recognition
Makes AI better at guessing without retraining.