Improving Out-of-Distribution Detection via Dynamic Covariance Calibration
By: Kaiyu Guo, Zijian Wang, Tan Pan, and more
Potential Business Impact:
Helps AI spot data it wasn't trained to handle.
Out-of-Distribution (OOD) detection is essential for the trustworthiness of AI systems. Methods that use prior information (i.e., subspace-based methods) perform well by extracting the information geometry of the training distribution and detecting OOD data with a more appropriate distance metric. However, because they extract this geometry statically from the training distribution, such methods cannot correct geometry that has been distorted by ill-distributed samples. In this paper, we argue that the influence of ill-distributed samples can be corrected by dynamically adjusting the prior geometry in response to new data. Based on this insight, we propose a novel approach that dynamically updates the prior covariance matrix using real-time input features, refining the geometry it encodes. Specifically, we reduce the covariance along the direction of each real-time input feature and constrain the adjustment to the residual space, thereby preserving essential data characteristics and avoiding unintended effects on directions in the principal space. We evaluate our method with two pre-trained models on the CIFAR datasets and five pre-trained models on ImageNet-1k, including the self-supervised DINO model. Extensive experiments demonstrate that our approach significantly enhances OOD detection across these models. The code is released at https://github.com/workerbcd/ooddcc.
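The dynamic calibration step described in the abstract can be sketched in a few lines of NumPy. The snippet below is a minimal illustration under stated assumptions, not the authors' released implementation (see the linked repository): the names `fit_prior` and `dynamic_ood_score`, the subspace size `k`, and the step size `alpha` are all hypothetical, and the rank-one downdate is one plausible reading of "reduce the covariance along the direction of real-time input features" restricted to the residual space.

```python
import numpy as np

def fit_prior(train_feats, k=64, eps=1e-6):
    """Estimate the static prior geometry from ID training features.

    Returns the feature mean, covariance, and a projector onto the
    residual space (the orthogonal complement of the top-k principal
    directions of the covariance).
    """
    mu = train_feats.mean(axis=0)
    cov = np.cov(train_feats - mu, rowvar=False)
    cov += eps * np.eye(cov.shape[0])          # numerical stabilizer
    _, U = np.linalg.eigh(cov)                 # eigenvectors, ascending eigenvalues
    principal = U[:, -k:]                      # top-k principal directions
    p_res = np.eye(cov.shape[0]) - principal @ principal.T
    return mu, cov, p_res

def dynamic_ood_score(x, mu, cov, p_res, alpha=0.1):
    """Mahalanobis-style OOD score with a dynamic covariance downdate.

    The covariance is reduced along the test feature's direction, with
    the update confined to the residual space so the principal (ID)
    geometry is left untouched. `alpha` is a hypothetical small step
    size; larger scores indicate more OOD-like inputs.
    """
    d = x - mu
    r = p_res @ d                              # residual-space component only
    norm = np.linalg.norm(r)
    if norm > 1e-12:
        u = r / norm
        var_u = u @ cov @ u                    # current variance along u
        # Rank-one downdate: shrink the variance along u by factor alpha.
        cov = cov - alpha * var_u * np.outer(u, u)
    return float(d @ np.linalg.solve(cov, d))  # Mahalanobis distance

# Toy usage with random features (d=128, n=1000):
rng = np.random.default_rng(0)
feats = rng.normal(size=(1000, 128))
mu, cov, p_res = fit_prior(feats, k=16)
print(dynamic_ood_score(3.0 * rng.normal(size=128), mu, cov, p_res))
```

Note that because the update is applied to `p_res @ d`, a test feature lying entirely in the principal subspace leaves the prior covariance unchanged, which mirrors the abstract's goal of avoiding effects on unintended directions in the principal space.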
Similar Papers
EigenScore: OOD Detection using Covariance in Diffusion Models
CV and Pattern Recognition
Finds fake data that could trick smart computers.
OODD: Test-time Out-of-Distribution Detection with Dynamic Dictionary
CV and Pattern Recognition
Helps computers spot fake or weird data.
Graph Out-of-Distribution Detection via Test-Time Calibration with Dual Dynamic Dictionaries
Machine Learning (CS)
Finds weird computer data that doesn't fit.