Exploiting Inter-Sample Information for Long-tailed Out-of-Distribution Detection

Published: November 20, 2025 | arXiv ID: 2511.16015v1

By: Nimeshika Udayangani, Hadi M. Dolatabadi, Sarah Erfani, and more

Potential Business Impact:

Helps image-recognition systems flag inputs that fall outside their training data, improving reliability on rare classes.

Business Areas:
Image Recognition, Data and Analytics, Software

Detecting out-of-distribution (OOD) data is essential for the safe deployment of deep neural networks (DNNs). This problem becomes particularly challenging in the presence of long-tailed in-distribution (ID) datasets, often leading to high false positive rates (FPR) and low tail-class ID classification accuracy. In this paper, we demonstrate that exploiting inter-sample relationships using a graph-based representation can significantly improve OOD detection in long-tailed recognition of vision datasets. To this end, we use the feature space of a pre-trained model to initialize our graph structure. We account for the differences between the activation-layer distributions of the pre-training vs. training data, and actively introduce Gaussianization to alleviate any deviations from a standard normal distribution in the activation layers of the pre-trained model. We then refine this initial graph representation using graph convolutional networks (GCNs) to arrive at a feature space suitable for long-tailed OOD detection. This lets us address the inferior tail-class ID performance observed in existing OOD detection methods. Experiments on three benchmarks (CIFAR10-LT, CIFAR100-LT, and ImageNet-LT) demonstrate that our method outperforms state-of-the-art approaches by a large margin in terms of FPR and tail-class ID classification accuracy.
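The Gaussianization step mentioned in the abstract can be illustrated with a rank-based transform: each activation value is replaced by the standard-normal quantile of its empirical rank, pushing the feature distribution toward N(0, 1). This is a minimal stdlib-only sketch of that general idea; the function name `gaussianize` and the mid-rank formula are illustrative assumptions, not the paper's exact transform.

```python
from statistics import NormalDist

def gaussianize(values):
    """Rank-based Gaussianization sketch (hypothetical, not the paper's code).

    Maps each value to the standard-normal quantile of its mid-rank
    probability, so the output is approximately N(0, 1) distributed
    while preserving the ordering of the inputs.
    """
    n = len(values)
    # Sort indices by value; ties are broken by stable sort order.
    order = sorted(range(n), key=lambda i: values[i])
    z = [0.0] * n
    for rank, i in enumerate(order):
        p = (rank + 0.5) / n            # mid-rank probability in (0, 1)
        z[i] = NormalDist().inv_cdf(p)  # standard-normal quantile
    return z

# Example: heavily skewed activations become symmetric around zero.
acts = [0.1, 0.2, 0.3, 5.0, 9.0]
scores = gaussianize(acts)
```

Because the transform depends only on ranks, it is robust to the heavy-tailed activation statistics that arise when the pre-training and training data differ.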

Repos / Data Links

Page Count
17 pages

Category
Computer Science:
CV and Pattern Recognition