Adaptive Coverage Policies in Conformal Prediction
By: Etienne Gauthier, Francis Bach, Michael I. Jordan
Potential Business Impact:
Makes predictions more helpful by adjusting how confident they need to be for each example.
Traditional conformal prediction methods construct prediction sets such that the true label falls within the set with a user-specified coverage level. However, poorly chosen coverage levels can result in uninformative predictions, either producing overly conservative sets when the coverage level is too high, or empty sets when it is too low. Moreover, the fixed coverage level cannot adapt to the specific characteristics of each individual example, limiting the flexibility and efficiency of these methods. In this work, we leverage recent advances in e-values and post-hoc conformal inference, which allow the use of data-dependent coverage levels while maintaining valid statistical guarantees. We propose to optimize an adaptive coverage policy by training a neural network using a leave-one-out procedure on the calibration set, allowing the coverage level and the resulting prediction set size to vary with the difficulty of each individual example. We support our approach with theoretical coverage guarantees and demonstrate its practical benefits through a series of experiments.
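To make the two standard building blocks in the abstract concrete, here is a minimal sketch, not the authors' code: an ordinary split conformal set at a fixed coverage level, and the conformal e-value whose Markov-type guarantee is what permits data-dependent (post-hoc) coverage levels. The score values, function names, and the toy demo are illustrative assumptions; the paper's learned, neural-network coverage policy is not reproduced here.

```python
# Sketch of split conformal sets and conformal e-values (illustrative only).
import numpy as np

def split_conformal_set(cal_scores, test_scores, alpha):
    """Split conformal prediction set at fixed level 1 - alpha.

    cal_scores:  nonconformity scores s(x_i, y_i) on the calibration set.
    test_scores: scores s(x_test, y) for every candidate label y.
    """
    n = len(cal_scores)
    # Finite-sample-corrected empirical quantile of the calibration scores.
    level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    q_hat = np.quantile(cal_scores, level, method="higher")
    return np.where(test_scores <= q_hat)[0]

def conformal_e_values(cal_scores, test_scores):
    """Conformal e-value of each candidate label: its score divided by the
    pooled average score, e(y) = (n + 1) * s(y) / (sum_i s_i + s(y)).
    Under exchangeability E[e] = 1, so Markov's inequality gives
    P(e >= 1/alpha) <= alpha; e-values additionally tolerate an alpha
    chosen after seeing the data, which is what lets an adaptive,
    example-dependent coverage level retain a valid guarantee."""
    n = len(cal_scores)
    return (n + 1) * test_scores / (np.sum(cal_scores) + test_scores)

def e_value_set(cal_scores, test_scores, alpha):
    """Keep the labels whose conformal e-value stays below 1/alpha."""
    e = conformal_e_values(cal_scores, test_scores)
    return np.where(e < 1.0 / alpha)[0]

# Toy demo with synthetic scores (purely illustrative).
rng = np.random.default_rng(0)
cal_scores = rng.uniform(size=500)   # calibration nonconformity scores
test_scores = rng.uniform(size=10)   # one score per candidate label
print(split_conformal_set(cal_scores, test_scores, alpha=0.1))
print(e_value_set(cal_scores, test_scores, alpha=0.1))
```

In the paper's setting, the fixed `alpha` passed to the e-value set would be replaced by a per-example level produced by a policy network trained with a leave-one-out procedure on the calibration set; the sketch above only shows the e-value primitive that makes such a data-dependent choice statistically admissible.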
Similar Papers
Backward Conformal Prediction
Machine Learning (Stat)
Makes predictions smaller and more useful.
Multi-Scale Conformal Prediction: A Theoretical Framework with Coverage Guarantees
Statistics Theory
Makes computer guesses more accurate at different levels.
Conformal Prediction Sets with Improved Conditional Coverage using Trust Scores
Machine Learning (CS)
Helps AI know when it's likely wrong.