Score: 2

SpeedCP: Fast Kernel-based Conditional Conformal Prediction

Published: September 28, 2025 | arXiv ID: 2509.24100v1

By: Yeo Jin Jung, Yating Liu, Zixuan Wu, and more

Potential Business Impact:

Makes machine-learning predictions more trustworthy by attaching calibrated uncertainty intervals, and computes them much faster than prior methods.

Business Areas:
A/B Testing, Data and Analytics

Conformal prediction provides distribution-free prediction sets with finite-sample conditional guarantees. We build upon the RKHS-based framework of Gibbs et al. (2023), which leverages families of covariate shifts to provide approximate conditional conformal prediction intervals, an approach with strong theoretical promise but prohibitive computational cost. To bridge this gap, we develop a stable and efficient algorithm that computes the full solution path of the regularized RKHS conformal optimization problem at essentially the same cost as a single kernel quantile fit. Our path-tracing framework simultaneously tunes hyperparameters, providing smoothness control and data-adaptive calibration. To extend the method to high-dimensional settings, we further integrate our approach with low-rank latent embeddings that capture conditional validity in a data-driven latent space. Empirically, our method provides reliable conditional coverage across a variety of modern black-box predictors, reducing the interval length of Gibbs et al. (2023) by 30% while achieving a 40-fold speedup.
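To give a rough sense of the idea of conditional (covariate-localized) conformal calibration, the sketch below builds kernel-weighted conformal intervals around a black-box predictor on synthetic data. It is not the paper's RKHS solution-path algorithm; the Gaussian kernel, absolute-residual scores, bandwidth, and all function names are illustrative assumptions.

```python
import numpy as np

# Hypothetical toy setup: calibration data and a stand-in black-box predictor.
rng = np.random.default_rng(0)
n_cal = 500
X_cal = rng.uniform(-2, 2, size=(n_cal, 1))
Y_cal = np.sin(2 * X_cal[:, 0]) + 0.3 * (1 + np.abs(X_cal[:, 0])) * rng.normal(size=n_cal)
mu = lambda x: np.sin(2 * x[:, 0])          # stand-in for any fitted black-box model
scores = np.abs(Y_cal - mu(X_cal))          # absolute-residual conformity scores

def kernel_weighted_interval(x_new, alpha=0.1, bandwidth=0.5):
    """Localized conformal interval around mu(x_new).

    A Gaussian kernel re-weights calibration scores near x_new; the
    (1 - alpha) weighted quantile of the scores gives the interval
    half-width. This illustrates kernel-localized calibration only,
    not the paper's regularized RKHS path-tracing method.
    """
    w = np.exp(-np.sum((X_cal - x_new) ** 2, axis=1) / (2 * bandwidth ** 2))
    w = w / w.sum()
    order = np.argsort(scores)
    cum_w = np.cumsum(w[order])
    q = scores[order][np.searchsorted(cum_w, 1 - alpha)]
    center = mu(x_new.reshape(1, -1))[0]
    return center - q, center + q

for x in np.linspace(-1.5, 1.5, 5):
    lo, hi = kernel_weighted_interval(np.array([x]))
    print(f"x = {x:+.2f}  interval = [{lo:+.2f}, {hi:+.2f}]")
```

Because the noise scale grows with |x| in this toy data, the locally weighted quantile yields wider intervals far from the origin, which is the kind of conditional adaptivity the paper targets (there via an RKHS family of covariate shifts rather than a fixed kernel smoother).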

Country of Origin
🇺🇸 United States


Page Count
38 pages

Category
Statistics: Methodology