Conformal prediction without knowledge of labeled calibration data
By: Jonas Flechsig, Maximilian Pilz
Potential Business Impact:
Lets computers guess answers with a safety net.
We extend conformal prediction beyond the standard setting, which relies on labeled calibration data. By replacing the calibration scores with suitable estimates, we construct conformity sets $C$ for classification and regression models from unlabeled calibration data alone. Given a classification model with accuracy $1-\beta$, we prove that the conformity sets guarantee a coverage of $P(Y \in C) \geq 1-\alpha-\beta$ for an arbitrary parameter $\alpha \in (0,1)$. The same coverage guarantee holds for regression models, provided the accuracy is replaced by an analogous exactness measure. Finally, we describe how to apply these theoretical results in practice.
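To make the idea concrete, here is a minimal sketch of one plausible instantiation for classification: the unknown calibration scores are estimated by scoring each unlabeled calibration point at the model's own predicted label, and the usual split-conformal quantile is taken over these estimates. The score choice ($1-\hat p$), this particular estimator, and the function name `unlabeled_conformal_sets` are illustrative assumptions, not necessarily the paper's exact construction.

```python
import numpy as np

def unlabeled_conformal_sets(probs_cal, probs_test, alpha):
    """Split conformal classification with unlabeled calibration data (sketch).

    True calibration labels are unavailable, so the nonconformity score of
    each calibration point is estimated at the model's own predicted label
    (an assumed choice of "suitable estimate").

    probs_cal:  (n, K) predicted class probabilities on unlabeled calibration data
    probs_test: (m, K) predicted class probabilities on test points
    alpha:      miscoverage budget in (0, 1)
    """
    n = probs_cal.shape[0]
    # Estimated nonconformity score: 1 - probability of the predicted label.
    y_hat = probs_cal.argmax(axis=1)
    scores = 1.0 - probs_cal[np.arange(n), y_hat]
    # Finite-sample-adjusted (1 - alpha) quantile of the estimated scores.
    level = min(np.ceil((n + 1) * (1.0 - alpha)) / n, 1.0)
    q_hat = np.quantile(scores, level, method="higher")
    # Conformity set: all labels whose score does not exceed the threshold.
    return [np.flatnonzero(1.0 - p <= q_hat) for p in probs_test]

# Toy usage: Dirichlet draws stand in for a model's softmax outputs.
rng = np.random.default_rng(0)
probs_cal = rng.dirichlet(np.ones(3), size=500)
probs_test = rng.dirichlet(np.ones(3), size=5)
for labels in unlabeled_conformal_sets(probs_cal, probs_test, alpha=0.1):
    print(labels)
```

With labeled calibration data this reduces to standard split conformal prediction; under the abstract's result, the extra $\beta$ in the $1-\alpha-\beta$ guarantee absorbs the calibration points whose predicted label, and hence whose estimated score, is wrong.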
Similar Papers
On some practical challenges of conformal prediction
Machine Learning (Stat)
Makes computer predictions more reliable and faster.
Efficient Conformal Prediction for Regression Models under Label Noise
Machine Learning (CS)
Makes medical scans more trustworthy even when labels are noisy.
Split Conformal Classification with Unsupervised Calibration
Machine Learning (Stat)
Lets computers learn from all data, not just some.