Robust Conformal Prediction with a Single Binary Certificate

Published: March 7, 2025 | arXiv ID: 2503.05239v1

By: Soroush H. Zargarbashi, Aleksandar Bojchevski

Potential Business Impact:

Makes AI predictions more reliable and faster.

Business Areas:
A/B Testing, Data and Analytics

Conformal prediction (CP) converts any model's output into prediction sets that are guaranteed to cover the true label with (adjustable) high probability. Robust CP extends this guarantee to worst-case (adversarial) inputs. Existing baselines achieve robustness by bounding randomly smoothed conformity scores; in practice, they need expensive Monte-Carlo (MC) sampling (e.g. $\sim10^4$ samples per point) to maintain an acceptable set size. We propose a robust conformal prediction method that produces smaller sets even with significantly fewer MC samples (e.g. 150 for CIFAR10). Our approach binarizes samples with an adjustable (or automatically adjusted) threshold selected to preserve the coverage guarantee. Remarkably, we prove that robustness can be achieved by computing only one binary certificate, unlike previous methods that certify each calibration (or test) point. Thus, our method is faster and returns smaller robust sets. We also eliminate a previous limitation that required a bounded score function.
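The binarization step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: `score_fn`, `tau`, and `sigma` are hypothetical placeholders, and we assume smaller conformity scores indicate better conformity, so each MC sample is reduced to a single bit (score below the threshold or not).

```python
import numpy as np

def binarized_smoothed_score(score_fn, x, y, tau, n_samples=150, sigma=0.25, rng=None):
    """Monte-Carlo estimate of p = P(score(x + noise, y) <= tau).

    Illustrative sketch only: score_fn, tau, and sigma are assumed
    placeholders, not the paper's actual choices. Each noisy sample
    contributes one bit (below threshold or not), so only the
    proportion p needs to be certified, not the full score values.
    """
    rng = np.random.default_rng(rng)
    # Draw Gaussian smoothing noise around the input.
    noise = rng.normal(0.0, sigma, size=(n_samples,) + np.shape(x))
    scores = np.array([score_fn(x + eps, y) for eps in noise])
    # Binarize: fraction of smoothed samples whose score falls under tau.
    return np.mean(scores <= tau)
```

In this view, calibration and prediction operate on the binary proportions rather than raw scores, which is what allows a single binary certificate (a bound on how much such a proportion can shift under a worst-case perturbation) to cover all points at once.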

Country of Origin
🇩🇪 Germany

Page Count
26 pages

Category
Computer Science:
Machine Learning (CS)