Polysemantic Dropout: Conformal OOD Detection for Specialized LLMs

Published: September 4, 2025 | arXiv ID: 2509.04655v1

By: Ayush Gupta, Ramneet Kaur, Anirban Roy, and more

Potential Business Impact:

Flags inputs that fall outside a specialized LLM's domain, reducing the risk of incorrect or unreliable outputs in critical applications such as medicine.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

We propose a novel inference-time out-of-domain (OOD) detection algorithm for specialized large language models (LLMs). Despite achieving state-of-the-art performance on in-domain tasks through fine-tuning, specialized LLMs remain vulnerable to incorrect or unreliable outputs when presented with OOD inputs, posing risks in critical applications. Our method leverages the Inductive Conformal Anomaly Detection (ICAD) framework, using a new non-conformity measure based on the model's dropout tolerance. Motivated by recent findings on polysemanticity and redundancy in LLMs, we hypothesize that in-domain inputs exhibit higher dropout tolerance than OOD inputs. We aggregate dropout tolerance across multiple layers via a valid ensemble approach, improving detection while maintaining theoretical false alarm bounds from ICAD. Experiments with medical-specialized LLMs show that our approach detects OOD inputs better than baseline methods, with AUROC improvements of $2\%$ to $37\%$ when treating OOD datapoints as positives and in-domain test datapoints as negatives.
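To make the ICAD-based detection pipeline concrete, here is a minimal sketch of how per-layer non-conformity scores (e.g., derived from dropout tolerance) could be turned into calibrated p-values and an OOD decision. The function names (`icad_p_value`, `ensemble_p_value`, `detect_ood`), the twice-the-mean p-value merging rule, and the toy scores are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def icad_p_value(test_score: float, calibration_scores: np.ndarray) -> float:
    """ICAD p-value: fraction of calibration non-conformity scores at least
    as large as the test score (plus-one smoothing preserves the
    finite-sample false-alarm guarantee)."""
    n = len(calibration_scores)
    return (np.sum(calibration_scores >= test_score) + 1) / (n + 1)

def ensemble_p_value(per_layer_p_values: list[float]) -> float:
    """One valid way to merge p-values from several layers: twice the
    arithmetic mean (capped at 1) is again a valid p-value. The paper's
    ensemble rule may differ; this is only an illustration."""
    return min(1.0, 2.0 * float(np.mean(per_layer_p_values)))

def detect_ood(per_layer_test_scores, per_layer_calibration_scores, alpha=0.05):
    """Flag an input as OOD when the merged p-value falls below alpha,
    which bounds the false-alarm rate on in-domain inputs by roughly alpha."""
    p_values = [
        icad_p_value(score, np.asarray(cal))
        for score, cal in zip(per_layer_test_scores, per_layer_calibration_scores)
    ]
    p = ensemble_p_value(p_values)
    return p < alpha, p

# Toy usage: higher non-conformity (lower dropout tolerance) yields a smaller p-value.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    calib = [rng.normal(0.0, 1.0, size=200) for _ in range(3)]  # per-layer calibration scores
    in_domain_scores = [0.1, -0.2, 0.0]   # typical (in-domain-like) scores
    ood_scores = [3.5, 4.0, 3.8]          # unusually large (OOD-like) scores
    print(detect_ood(in_domain_scores, calib))  # expected: (False, large p-value)
    print(detect_ood(ood_scores, calib))        # expected: (True, small p-value)
```

In this sketch, an in-domain input with high dropout tolerance would map to a low non-conformity score, hence a large p-value and no alarm, while an OOD input with low tolerance would be flagged.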

Page Count
13 pages

Category
Computer Science:
Computation and Language