Score: 2

Revisiting Likelihood-Based Out-of-Distribution Detection by Modeling Representations

Published: April 10, 2025 | arXiv ID: 2504.07793v3

By: Yifan Ding, Arturas Aleksandraus, Amirhossein Ahmadian, and more

Potential Business Impact:

Helps AI systems flag inputs that differ from their training data, improving reliability in safety-critical applications.

Business Areas:
Image Recognition, Data and Analytics, Software

Out-of-distribution (OOD) detection is critical for ensuring the reliability of deep learning systems, particularly in safety-critical applications. Likelihood-based deep generative models have historically faced criticism for their unsatisfactory performance in OOD detection, often assigning higher likelihoods to OOD data than to in-distribution samples when applied to image data. In this work, we demonstrate that likelihood is not inherently flawed. Rather, several properties of the image space prevent likelihood from serving as a valid detection score. Given a sufficiently good likelihood estimator, specifically the probability flow formulation of a diffusion model, we show that likelihood-based methods can still perform on par with state-of-the-art methods when applied in the representation space of pre-trained encoders. The code for our work can be found at https://github.com/limchaos/Likelihood-OOD.git.
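
The pipeline described in the abstract (encode images with a pre-trained encoder, fit a likelihood model on in-distribution features, flag low-likelihood inputs as OOD) can be sketched as below. This is a minimal illustration under stated assumptions, not the paper's implementation: the paper's probability-flow diffusion likelihood is replaced by a simple Gaussian density for brevity, and the function names (`fit_density`, `ood_scores`) and the synthetic features are hypothetical.

```python
# Minimal sketch of likelihood-based OOD scoring in representation space.
# Assumption: images have already been mapped to feature vectors by a
# pre-trained encoder. The diffusion probability-flow likelihood used in
# the paper is replaced here by a multivariate Gaussian stand-in.
import numpy as np
from scipy.stats import multivariate_normal


def fit_density(train_feats: np.ndarray):
    """Fit a stand-in density model on in-distribution (ID) features."""
    mean = train_feats.mean(axis=0)
    # Small diagonal term keeps the covariance well-conditioned.
    cov = np.cov(train_feats, rowvar=False) + 1e-4 * np.eye(train_feats.shape[1])
    return multivariate_normal(mean=mean, cov=cov)


def ood_scores(density, feats: np.ndarray) -> np.ndarray:
    """Negative log-likelihood as the OOD score: higher means more OOD."""
    return -density.logpdf(feats)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 16
    # Synthetic stand-ins for encoder features (hypothetical data).
    in_train = rng.normal(0.0, 1.0, size=(1000, d))
    in_test = rng.normal(0.0, 1.0, size=(200, d))
    ood_test = rng.normal(3.0, 1.0, size=(200, d))  # shifted "OOD" features

    density = fit_density(in_train)
    s_in = ood_scores(density, in_test)
    s_ood = ood_scores(density, ood_test)

    # Threshold at the 95th percentile of ID scores; report the fraction
    # of OOD samples wrongly scored as in-distribution.
    thresh = np.quantile(s_in, 0.95)
    print(f"Fraction of OOD below ID threshold: {(s_ood < thresh).mean():.3f}")
```

In the paper's setting, the Gaussian stand-in would be replaced by the exact log-likelihood computed via the probability flow ODE of a diffusion model trained on encoder features; the thresholding logic stays the same.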

Country of Origin
πŸ‡ΈπŸ‡ͺ Sweden

Repos / Data Links
https://github.com/limchaos/Likelihood-OOD.git

Page Count
18 pages

Category
Computer Science:
Machine Learning (CS)