Fundamental bounds on efficiency-confidence trade-off for transductive conformal prediction
By: Arash Behboodi, Alvaro H. C. Correia, Fabio Valerio Massoli and more
Potential Business Impact:
Makes computer guesses more honest about being wrong.
Transductive conformal prediction addresses simultaneous prediction for multiple data points. Given a desired confidence level, the objective is to construct a prediction set that includes the true outcomes with the prescribed confidence. We demonstrate a fundamental trade-off between confidence and efficiency in transductive methods, where efficiency is measured by the size of the prediction sets. Specifically, we derive a strict finite-sample bound showing that any non-trivial confidence level leads to exponential growth in prediction set size for data with inherent uncertainty. The exponent scales linearly with the number of samples and is proportional to the conditional entropy of the data. The bound also includes a second-order term, the dispersion, defined as the variance of the log conditional probability. We show that this bound is achievable in an idealized setting. Finally, we examine a special case of transductive prediction in which all test data points share the same label. We show that this scenario reduces to a hypothesis testing problem with empirically observed statistics and provide an asymptotically optimal confidence predictor, along with an analysis of the error exponent.
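The abstract describes the bound only qualitatively. As a rough schematic (the coefficient c(ε) and the exact form of the correction term are placeholders of mine, not quantities taken from the paper), the stated scaling for a prediction set C over n test points at miscoverage level ε has the shape:

```latex
% Schematic only: H and V are the conditional entropy and dispersion named in
% the abstract; c(\epsilon) is an illustrative, confidence-dependent placeholder.
\log \big|\mathcal{C}(X_1,\dots,X_n)\big|
  \;\gtrsim\; n\, H(Y \mid X) \;+\; c(\epsilon)\,\sqrt{n\, V(Y \mid X)},
\qquad
H(Y \mid X) = \mathbb{E}\!\left[-\log p(Y \mid X)\right],
\quad
V(Y \mid X) = \mathrm{Var}\!\left[-\log p(Y \mid X)\right].
```

To make the efficiency side concrete, below is a minimal Python sketch of a transductive-style joint prediction set built by thresholding each test point with a split-conformal quantile. This is a generic toy construction (the function name and the per-point thresholding rule are assumptions for illustration, not the predictor analyzed in the paper), but it shows why joint sets over label tuples grow multiplicatively, and hence exponentially, in the number of test points when the data are noisy.

```python
import itertools

import numpy as np


def toy_transductive_set(cal_scores, test_scores_by_label, alpha):
    """Toy sketch of a joint (transductive-style) prediction set.

    Generic illustration, not the paper's construction: each test point is
    thresholded with a split-conformal quantile, and the joint set keeps the
    label tuples whose every coordinate conforms.

    cal_scores: 1-D array of nonconformity scores from a calibration set.
    test_scores_by_label: array of shape (n_test, n_labels); entry [i, y] is
        the nonconformity score of test point i under candidate label y.
    alpha: target miscoverage level (confidence is 1 - alpha).
    """
    n_cal = len(cal_scores)
    n_test, n_labels = test_scores_by_label.shape
    # Finite-sample conformal quantile level, capped at 1.
    level = min(np.ceil((1 - alpha) * (n_cal + 1)) / n_cal, 1.0)
    q = np.quantile(cal_scores, level)
    joint_set = []
    # Enumerate all label tuples; keep those whose every coordinate conforms.
    for labels in itertools.product(range(n_labels), repeat=n_test):
        scores = test_scores_by_label[np.arange(n_test), list(labels)]
        if np.all(scores <= q):
            joint_set.append(labels)
    return joint_set


# Usage on synthetic scores: with noisy (high-entropy) data, each per-point
# set contains several labels, so the joint set size grows multiplicatively
# with the number of test points.
rng = np.random.default_rng(0)
cal = rng.uniform(size=200)
test = rng.uniform(size=(4, 5))  # 4 test points, 5 candidate labels each
print(len(toy_transductive_set(cal, test, alpha=0.1)))
```

In this naive construction the joint set is exactly the Cartesian product of the per-point sets, so its size is the product of their sizes. The abstract's bound makes a stronger statement: for data with inherent uncertainty, any non-trivial confidence level forces this kind of exponential growth, with the rate governed by the conditional entropy and a second-order dispersion correction.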
Similar Papers
Interpretable Multivariate Conformal Prediction with Fast Transductive Standardization
Methodology
Helps predict many things accurately at once.
On some practical challenges of conformal prediction
Machine Learning (Stat)
Makes computer predictions more reliable and faster.
Cost-Sensitive Conformal Training with Provably Controllable Learning Bounds
Machine Learning (CS)
Makes AI predictions more accurate and reliable.