Score: 1

Out-of-Distribution Detection using Counterfactual Distance

Published: August 13, 2025 | arXiv ID: 2508.10148v1

By: Maria Stoica, Francesco Leofante, Alessio Lomuscio

Potential Business Impact:

Helps AI systems recognize when an input is unlike their training data, so they can flag it instead of making an unreliable prediction.

Accurate and explainable out-of-distribution (OOD) detection is required to use machine learning systems safely. Previous work has shown that feature distance to decision boundaries can be used to identify OOD data effectively. In this paper, we build on this intuition and propose a post-hoc OOD detection method that, given an input, calculates the distance to decision boundaries by leveraging counterfactual explanations. Since computing explanations can be expensive for large architectures, we also propose strategies to improve scalability by computing counterfactuals directly in embedding space. Crucially, as the method employs counterfactual explanations, we can seamlessly use them to help interpret the results of our detector. We show that our method is in line with the state of the art on CIFAR-10, achieving 93.50% AUROC and 25.80% FPR95. It outperforms state-of-the-art methods on CIFAR-100, with 97.05% AUROC and 13.79% FPR95, and on ImageNet-200, with 92.55% AUROC and 33.55% FPR95, across four OOD datasets.
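The abstract describes the core recipe: for a given input, search for a counterfactual (a nearby point the classifier assigns to a different class), take the distance to that counterfactual as a proxy for distance to the decision boundary, and use it as an OOD score, with the search done in embedding space for scalability. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' implementation: the `encoder`/`head` split, the optimization loop, the margin-plus-proximity loss, and all hyperparameters are assumptions for illustration.

```python
# Hypothetical sketch of an embedding-space counterfactual-distance OOD score
# (illustrative only; not the paper's code or exact objective).
import torch
import torch.nn.functional as F

def counterfactual_distance(encoder, head, x, steps=200, lr=0.05, prox_weight=0.1):
    """Return the embedding-space distance from x to a counterfactual embedding
    that the classifier head assigns to a different class."""
    with torch.no_grad():
        z0 = encoder(x)                       # original embedding
        y0 = head(z0).argmax(dim=1)           # originally predicted class
    z = z0.clone().requires_grad_(True)       # counterfactual candidate, optimized in embedding space
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        logits = head(z)
        # Margin between the original class and its best competitor;
        # pushing it below zero flips the prediction (a counterfactual).
        orig_logit = logits.gather(1, y0[:, None]).squeeze(1)
        runner_up = logits.topk(2, dim=1).values[:, 1]
        margin = orig_logit - runner_up
        # Encourage a class flip while keeping z close to the original embedding.
        loss = F.relu(margin).mean() + prox_weight * (z - z0).pow(2).sum(dim=1).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
    # The resulting distance approximates how far the input sits from the
    # decision boundary; the paper thresholds such distances to detect OOD inputs.
    return (z.detach() - z0).norm(dim=1)
```

In practice one would calibrate a threshold on this score using in-distribution validation data (e.g., to fix the 95% true-positive operating point behind FPR95); the counterfactual embedding itself can also be inspected to explain why an input was flagged, which is the interpretability benefit the abstract highlights.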

Country of Origin
🇬🇧 United Kingdom

Page Count
11 pages

Category
Computer Science:
Machine Learning (CS)