OODD: Test-time Out-of-Distribution Detection with Dynamic Dictionary
By: Yifeng Yang, Lin Zhu, Zewen Sun, and more
Potential Business Impact:
Helps computers spot fake or weird data.
Out-of-distribution (OOD) detection remains challenging for deep learning models, particularly when test-time OOD samples differ significantly from training outliers. We propose OODD, a novel test-time OOD detection method that dynamically maintains and updates an OOD dictionary without fine-tuning. Our approach leverages a priority queue-based dictionary that accumulates representative OOD features during testing, combined with an informative inlier sampling strategy for in-distribution (ID) samples. To ensure stable performance during early testing, we introduce a dual OOD stabilization mechanism that leverages strategically generated outliers derived from ID data. Extensive experiments on the OpenOOD benchmark demonstrate that OODD significantly outperforms existing methods, achieving a 26.0% improvement in FPR95 on CIFAR-100 far-OOD detection compared to the state-of-the-art approach. Furthermore, we present an optimized variant of the KNN-based OOD detection framework that achieves a 3x speedup while maintaining detection performance.
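The abstract describes two concrete mechanics: a fixed-capacity, priority queue-based dictionary that accumulates the most OOD-like features seen at test time, and a KNN-style score against stored features. Below is a minimal Python sketch of that idea, not the paper's actual implementation: the names (OODDictionary, knn_score, detect), the capacity and k values, and the exact scoring rule are illustrative assumptions, and the paper's informative inlier sampling and dual OOD stabilization are omitted.

```python
import heapq
import numpy as np

def knn_score(feature, bank, k=5):
    # Negative distance to the k-th nearest neighbor in a feature bank:
    # larger (less negative) values mean the sample sits closer to the bank.
    dists = np.linalg.norm(bank - feature, axis=1)
    return -np.sort(dists)[min(k, len(dists)) - 1]

class OODDictionary:
    """Fixed-capacity priority queue of the most OOD-like test features (hypothetical sketch)."""

    def __init__(self, capacity=256):
        self.capacity = capacity
        self.heap = []       # min-heap of (ood_score, counter, feature)
        self.counter = 0     # tie-breaker so numpy arrays are never compared

    def maybe_add(self, feature, ood_score):
        entry = (ood_score, self.counter, feature)
        self.counter += 1
        if len(self.heap) < self.capacity:
            heapq.heappush(self.heap, entry)
        elif ood_score > self.heap[0][0]:
            # Evict the least OOD-like entry so the dictionary stays representative.
            heapq.heapreplace(self.heap, entry)

    def features(self):
        return np.stack([f for _, _, f in self.heap]) if self.heap else None

# Usage: score each streaming test feature against the ID bank and the dictionary.
rng = np.random.default_rng(0)
id_bank = rng.normal(0.0, 1.0, size=(1000, 128))   # stand-in for stored ID features
ood_dict = OODDictionary(capacity=256)

def detect(feature, margin=0.0):
    id_sim = knn_score(feature, id_bank)            # similarity to ID data
    bank = ood_dict.features()
    ood_sim = knn_score(feature, bank) if bank is not None else -np.inf
    score = id_sim - ood_sim                        # low score -> likely OOD
    ood_dict.maybe_add(feature, -id_sim)            # far-from-ID features enter the queue
    return score < margin                           # True -> flag as OOD
```

Keying the min-heap by the OOD score makes evicting the least OOD-like entry O(log n), which is why a priority queue is a natural fit for maintaining the dictionary over a test stream. Note that this sketch does nothing about the cold-start phase when the dictionary is still empty; in the paper, that gap is what the dual OOD stabilization mechanism with generated outliers addresses.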
Similar Papers
Revisiting Out-of-Distribution Detection in Real-time Object Detection: From Benchmark Pitfalls to a New Mitigation Paradigm
CV and Pattern Recognition
Teaches computers to ignore fake objects.
EOOD: Entropy-based Out-of-distribution Detection
CV and Pattern Recognition
Helps computers know when they see something new.
Graph Out-of-Distribution Detection via Test-Time Calibration with Dual Dynamic Dictionaries
Machine Learning (CS)
Finds weird computer data that doesn't fit.