Conformal Prediction Sets for Deep Generative Models via Reduction to Conformal Regression

Published: March 13, 2025 | arXiv ID: 2503.10512v2

By: Hooman Shahrokhi, Devjeet Raj Roy, Yan Yan, and others

Potential Business Impact:

Helps AI systems return answers with provable correctness guarantees, not just unvetted guesses.

Business Areas:
Artificial Intelligence, Software Development

We consider the problem of generating valid and small prediction sets by sampling outputs (e.g., software code or natural language text) from a black-box deep generative model for a given input (e.g., a textual prompt). The validity of a prediction set is determined by a user-defined binary admissibility function that depends on the target application; for example, in code generation, validity may require that at least one program in the set passes all test cases. To address this problem, we develop a simple and effective conformal inference algorithm referred to as Generative Prediction Sets (GPS). Given a set of calibration examples and black-box access to a deep generative model, GPS can generate prediction sets with provable guarantees. The key insight behind GPS is to exploit the inherent structure of the distribution over the minimum number of samples needed to obtain an admissible output, reducing the problem to a simple conformal regression over that quantity. Experiments on multiple code-generation and math word-problem datasets using different large language models demonstrate the efficacy of GPS over state-of-the-art methods.
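The reduction described in the abstract can be sketched in a few lines: for each calibration example, count how many samples the model needs before producing an admissible output, then take a split-conformal quantile of those counts as the test-time sampling budget. This is a minimal illustration of the general idea, not the paper's exact algorithm; `generate` and `is_admissible` are hypothetical stand-ins for the black-box model and the user-defined admissibility function.

```python
import math
import random

def min_samples_to_admissible(prompt, generate, is_admissible, cap=100):
    """Sample outputs until one is admissible; return the count needed.

    The count is truncated at a fixed sampling budget `cap`.
    """
    for k in range(1, cap + 1):
        if is_admissible(generate(prompt)):
            return k
    return cap

def calibrate_budget(calibration_prompts, generate, is_admissible, alpha=0.1):
    """Split-conformal quantile of the minimum-sample counts.

    Returns a sampling budget such that, under exchangeability, a fresh
    test prompt yields an admissible output within the budget with
    probability at least 1 - alpha.
    """
    scores = sorted(
        min_samples_to_admissible(p, generate, is_admissible)
        for p in calibration_prompts
    )
    n = len(scores)
    # Conformal rank: the ceil((n + 1) * (1 - alpha))-th smallest score.
    rank = math.ceil((n + 1) * (1 - alpha))
    return scores[min(rank, n) - 1]

def prediction_set(prompt, budget, generate):
    """At test time, the prediction set is `budget` sampled outputs."""
    return [generate(prompt) for _ in range(budget)]
```

As a toy usage, a stochastic `generate` with a 50% admissibility rate calibrates to a small budget (roughly the 90th percentile of a geometric distribution for `alpha=0.1`), and the test-time set simply contains that many samples.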

Country of Origin
🇺🇸 United States

Repos / Data Links

Page Count
31 pages

Category
Computer Science:
Machine Learning (CS)