Local Background Features Matter in Out-of-Distribution Detection
By: Jinlun Ye, Zhuohao Sun, Yiqiao Qiu, and more
Potential Business Impact:
Helps computers know when they see something new.
Out-of-distribution (OOD) detection is crucial when deploying deep neural networks in the real world to ensure the reliability and safety of their applications. One main challenge in OOD detection is that neural networks often produce overconfident predictions on OOD data. While methods that use auxiliary OOD datasets or generate fake OOD images have shown promising OOD detection performance, they are limited by the high costs of data collection and training. In this study, we propose a novel and effective OOD detection method that utilizes local background features as fake OOD features during model training. Motivated by the observation that OOD images generally share similar background regions with in-distribution (ID) images, we exploit the local invariance of convolution to extract background features from ID images and treat them as simulated OOD visual representations during training. By optimizing the network to reduce the $L_2$-norm of these background features, it learns to alleviate the overconfidence issue on OOD data. Extensive experiments on multiple standard OOD detection benchmarks confirm the effectiveness of our method and its broad compatibility with existing post-hoc methods, achieving new state-of-the-art performance.
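The core training signal described in the abstract — penalizing the $L_2$-norm of convolutional features taken from background regions — can be sketched as a simple auxiliary loss. The sketch below is a minimal illustration, not the paper's implementation: the function name `background_l2_loss` and the use of a precomputed boolean background mask (here, just the border of the feature map) are assumptions for demonstration; the paper's actual rule for selecting background locations may differ.

```python
import numpy as np

def background_l2_loss(feature_map, bg_mask):
    """Mean squared L2-norm of feature vectors at background locations.

    feature_map: (C, H, W) conv feature map.
    bg_mask: (H, W) boolean array, True where a spatial location is
    treated as background (assumed to be given by some upstream
    foreground/background split; hypothetical for this sketch).
    """
    # Gather the per-location feature vectors flagged as background: (C, N_bg)
    bg_feats = feature_map[:, bg_mask]
    # Squared L2-norm per background location, averaged over locations.
    # Minimizing this pushes background (simulated OOD) features toward zero.
    return float(np.mean(np.sum(bg_feats ** 2, axis=0)))

# Toy example: a 4-channel 8x8 feature map whose border is "background".
feats = np.ones((4, 8, 8))
mask = np.zeros((8, 8), dtype=bool)
mask[0, :] = mask[-1, :] = mask[:, 0] = mask[:, -1] = True
loss = background_l2_loss(feats, mask)  # each border vector has squared norm 4
```

In a full training loop this term would be added, with some weight, to the standard classification loss on ID images; driving background feature norms toward zero gives the network low-energy responses on background-like inputs, which is what counters overconfidence on OOD data.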
Similar Papers
BootOOD: Self-Supervised Out-of-Distribution Detection via Synthetic Sample Exposure under Neural Collapse
CV and Pattern Recognition
Helps computers spot fake pictures, even tricky ones.
Enhancing Abnormality Identification: Robust Out-of-Distribution Strategies for Deepfake Detection
CV and Pattern Recognition
Finds fake videos even from new tricks.
BOOD: Boundary-based Out-Of-Distribution Data Generation
Machine Learning (CS)
Helps computers spot fake images better.