Prompt Optimization Meets Subspace Representation Learning for Few-shot Out-of-Distribution Detection

Published: September 9, 2025 | arXiv ID: 2509.18111v1

By: Faizul Rakib Sayem, Shahana Ibrahim

Potential Business Impact:

AI flags inputs it was never trained on.

Business Areas:
Image Recognition, Data and Analytics, Software

The reliability of artificial intelligence (AI) systems in open-world settings depends heavily on their ability to flag out-of-distribution (OOD) inputs unseen during training. Recent advances in large-scale vision-language models (VLMs) have enabled promising few-shot OOD detection frameworks using only a handful of in-distribution (ID) samples. However, existing prompt learning-based OOD methods rely solely on softmax probabilities, overlooking the rich discriminative potential of the feature embeddings learned by VLMs trained on millions of samples. To address this limitation, we propose a novel context optimization (CoOp)-based framework that integrates subspace representation learning with prompt tuning. Our approach improves ID-OOD separability by projecting ID features into a subspace spanned by prompt vectors, while projecting ID-irrelevant features into an orthogonal null space. To train such an OOD detection framework, we design a tractable end-to-end learning criterion that ensures strong OOD detection performance as well as high ID classification accuracy. Experiments on real-world datasets demonstrate the effectiveness of our approach.
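The core geometric idea in the abstract — score a feature by how much of its energy falls outside the subspace spanned by the learned prompt vectors — can be sketched independently of the paper's training criterion. The snippet below is a minimal illustration, not the authors' method: the function name, array shapes, and use of a QR factorization to orthonormalize the (hypothetical) prompt matrix are all assumptions for demonstration.

```python
import numpy as np

def subspace_ood_score(features, prompt_vectors):
    """Residual-energy OOD score (illustrative sketch, not the paper's criterion).

    features:       (n, d) array of VLM feature embeddings.
    prompt_vectors: (k, d) array whose rows span the assumed ID subspace.
    Returns an (n,) array; larger values suggest more OOD-like inputs.
    """
    # Orthonormal basis for the span of the prompt vectors (columns of Q).
    Q, _ = np.linalg.qr(prompt_vectors.T)        # Q: (d, k)
    projected = features @ Q @ Q.T               # component inside the ID subspace
    residual = features - projected              # component in the orthogonal null space
    # ID samples should leave little energy in the null space; OOD samples more.
    return np.linalg.norm(residual, axis=-1)
```

For example, with prompt vectors spanning the first two coordinate axes of a 4-dimensional space, a feature lying in that plane scores near zero, while a feature orthogonal to it scores its full norm.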

Page Count
13 pages

Category
Computer Science:
Machine Learning (CS)