Stabilizing Open-Set Test-Time Adaptation via Primary-Auxiliary Filtering and Knowledge-Integrated Prediction
By: Byung-Joon Lee, Jin-Seop Lee, Jee-Hyong Lee
Potential Business Impact:
Helps AI stay accurate on the things it knows, even when unfamiliar things show up under changing conditions.
Deep neural networks perform strongly when training and test distributions are aligned, but real-world test data often exhibit domain shifts. Test-Time Adaptation (TTA) addresses this challenge by adapting the model to test data during inference. While most TTA studies assume that the training and test data share the same class set (closed-set TTA), real-world test streams often contain samples from unknown classes (open-set TTA, OSTTA), which can degrade closed-set accuracy. A recent study showed that identifying open-set data during adaptation and maximizing its entropy is an effective solution. However, that method relies on the source model for filtering, which yields suboptimal filtering accuracy on domain-shifted test data. The adapting model, which learns domain knowledge from the test stream, is a natural alternative filter, yet we found that it tends to be unstable under noisy test streams and causes error accumulation when used for filtering. To address this problem, we propose Primary-Auxiliary Filtering (PAF), which employs an auxiliary filter to validate the data filtered by the primary filter. Furthermore, we propose Knowledge-Integrated Prediction (KIP), which calibrates the outputs of the adapting model, EMA model, and source model to integrate their complementary knowledge for OSTTA. We validate our approach across diverse closed-set and open-set datasets; our method improves both closed-set accuracy and open-set discrimination over existing methods. The code is available at https://github.com/powerpowe/PAF-KIP-OSTTA.
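The abstract describes the two components only at a high level; the sketch below shows how they could slot into a test-time adaptation loop. It is a minimal PyTorch-style sketch, not the paper's implementation: the confidence thresholds (PRIMARY_TAU, AUX_TAU), the use of the frozen source model as the auxiliary filter, the EMA momentum, and the plain averaging that stands in for KIP's calibration are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F

# Illustrative hyperparameters; the paper's actual thresholds and calibration differ.
PRIMARY_TAU = 0.5    # confidence threshold for the primary (adapting-model) filter
AUX_TAU = 0.5        # confidence threshold for the auxiliary (source-model) filter
EMA_MOMENTUM = 0.999

@torch.no_grad()
def update_ema(ema_model, model, momentum=EMA_MOMENTUM):
    # Exponential moving average of the adapting model's weights.
    for p_ema, p in zip(ema_model.parameters(), model.parameters()):
        p_ema.mul_(momentum).add_(p, alpha=1.0 - momentum)

def entropy(probs, eps=1e-8):
    # Shannon entropy per sample for a batch of softmax outputs.
    return -(probs * (probs + eps).log()).sum(dim=1)

def adapt_step(model, ema_model, source_model, x, optimizer):
    """One adaptation step on an unlabeled test batch x.

    `model` is the adapting model (train mode); `ema_model` and `source_model`
    stay frozen in eval mode.
    """
    probs_adapt = F.softmax(model(x), dim=1)
    with torch.no_grad():
        probs_src = F.softmax(source_model(x), dim=1)
        probs_ema = F.softmax(ema_model(x), dim=1)

    # Primary filter: the adapting model flags low-confidence samples as open-set.
    primary_open = probs_adapt.max(dim=1).values < PRIMARY_TAU
    # Auxiliary filter: the frozen source model must agree before a sample
    # is actually treated as open-set.
    open_mask = primary_open & (probs_src.max(dim=1).values < AUX_TAU)
    closed_mask = ~open_mask

    # Minimize entropy on presumed closed-set data, maximize it on open-set data.
    ent = entropy(probs_adapt)
    loss = x.new_zeros(())
    if closed_mask.any():
        loss = loss + ent[closed_mask].mean()
    if open_mask.any():
        loss = loss - ent[open_mask].mean()

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    update_ema(ema_model, model)

    # Knowledge-Integrated Prediction: combine the three models' outputs.
    # Plain averaging stands in for the paper's calibration scheme.
    probs_kip = (probs_adapt.detach() + probs_ema + probs_src) / 3.0
    return probs_kip.argmax(dim=1)
```

In this sketch, a sample is entropy-maximized only when the stable source-side auxiliary filter agrees with the adapting model's primary filter, which limits the error accumulation described above, and the final prediction draws on all three models rather than the unstable adapting model alone.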
Similar Papers
Open-World Test-Time Adaptation with Hierarchical Feature Aggregation and Attention Affine
CV and Pattern Recognition
Helps AI tell real from fake, even when surprised.
Test-Time Model Adaptation for Quantized Neural Networks
CV and Pattern Recognition
Helps self-driving cars work better in changing weather.
Uncover and Unlearn Nuisances: Agnostic Fully Test-Time Adaptation
Machine Learning (CS)
Helps computers learn from new data without old examples.