A Few Observations on Sample-Conditional Coverage in Conformal Prediction
By: John C. Duchi
We revisit the problem of constructing predictive confidence sets for which we wish to obtain some type of conditional validity. We provide new arguments showing how "split conformal" methods achieve near desired coverage levels with high probability, a guarantee conditional on the validation data rather than marginal over it. In addition, we directly consider (approximate) conditional coverage, where, e.g., conditional on a covariate $X$ belonging to some group of interest, we would like a guarantee that a predictive set covers the true outcome $Y$. We show that the natural method of performing quantile regression on a held-out (validation) dataset yields minimax optimal guarantees of coverage here. Complementing these positive results, we also provide experimental evidence that interesting work remains to be done to develop computationally efficient but valid predictive inference methods.
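The split conformal construction the abstract refers to can be illustrated with a minimal sketch. This is not code from the paper; the synthetic data, the least-squares predictor, and the absolute-residual conformity score are all illustrative choices. The key step is taking the finite-sample-corrected $(1-\alpha)$-quantile of held-out conformity scores, which yields marginal coverage of at least $1-\alpha$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data (illustrative): Y = 2X + Gaussian noise
def sample(n):
    x = rng.uniform(0.0, 1.0, n)
    y = 2.0 * x + rng.normal(0.0, 0.3, n)
    return x, y

n_train, n_cal, n_test = 500, 500, 2000
x_tr, y_tr = sample(n_train)
x_cal, y_cal = sample(n_cal)
x_te, y_te = sample(n_test)

# Step 1: fit any point predictor on the training split (here, least squares)
a, b = np.polyfit(x_tr, y_tr, 1)
predict = lambda x: a * x + b

# Step 2: conformity scores on the held-out calibration split
scores = np.abs(y_cal - predict(x_cal))

# Step 3: (1 - alpha)-quantile with the finite-sample correction
alpha = 0.1
q = np.quantile(scores, np.ceil((n_cal + 1) * (1 - alpha)) / n_cal)

# Step 4: predictive sets [f(x) - q, f(x) + q]; coverage >= 1 - alpha marginally
covered = np.abs(y_te - predict(x_te)) <= q
print(f"empirical coverage: {covered.mean():.3f}")
```

The paper's point is finer than this marginal guarantee: it analyzes how close the coverage is to $1-\alpha$ *conditional on the calibration data*, i.e., with high probability over the draw of the held-out split rather than only on average over it.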