Bayesian Conformal Prediction via the Bayesian Bootstrap

Published: August 2, 2025 | arXiv ID: 2508.01418v1

By: Graham Gibson

Potential Business Impact:

Makes the uncertainty estimates attached to model predictions reliable

Reliable uncertainty quantification remains a central challenge in predictive modeling. While Bayesian methods are theoretically appealing, their predictive intervals can exhibit poor frequentist calibration, particularly with small sample sizes or model misspecification. We introduce a practical and broadly applicable Bayesian conformal approach based on the influence-function Bayesian bootstrap (BB) with data-driven tuning of the Dirichlet concentration parameter, α. By efficiently approximating the Bayesian bootstrap predictive distribution via influence functions and calibrating α to optimize empirical coverage or average log-probability, our method constructs prediction intervals and distributions that are both well-calibrated and sharp. Across a range of regression models and data settings, this Bayesian conformal framework consistently yields improved empirical coverage and log-score compared to standard Bayesian posteriors. Our procedure is fast, easy to implement, and offers a flexible approach for distributional calibration in predictive modeling.
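To make the idea concrete, here is a minimal sketch of the two ingredients the abstract names: Bayesian-bootstrap predictive draws obtained by reweighting the data with Dirichlet(α, ..., α) weights, and data-driven tuning of α to match a nominal coverage level on a held-out split. This is an illustration under simplifying assumptions (exact weighted refits of a linear model rather than the paper's influence-function approximation; residual noise added from a plug-in Gaussian), not the authors' implementation.

```python
# Hedged sketch: Bayesian-bootstrap prediction intervals for linear
# regression, with the Dirichlet concentration alpha chosen on a
# calibration split so that empirical coverage is near the nominal level.
# All names (bb_predictive, coverage, the alpha grid) are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def bb_predictive(X, y, X_new, alpha, n_draws=300):
    """Bayesian-bootstrap predictive draws: each draw reweights the data
    with Dirichlet(alpha, ..., alpha) weights, refits weighted OLS, and
    adds plug-in Gaussian residual noise (a simplification)."""
    n = len(y)
    Xd = np.column_stack([np.ones(n), X])
    Xn = np.column_stack([np.ones(len(X_new)), X_new])
    preds = np.empty((n_draws, len(X_new)))
    for d in range(n_draws):
        w = rng.dirichlet(np.full(n, alpha))          # random data weights
        sw = np.sqrt(w)
        beta, *_ = np.linalg.lstsq(Xd * sw[:, None], y * sw, rcond=None)
        resid = y - Xd @ beta
        sigma = np.sqrt(np.sum(w * resid**2))          # weighted residual SD
        preds[d] = Xn @ beta + rng.normal(0.0, sigma, len(X_new))
    return preds

def coverage(preds, y_true, level=0.9):
    """Fraction of y_true falling in the central `level` interval."""
    lo, hi = np.quantile(preds, [(1 - level) / 2, (1 + level) / 2], axis=0)
    return float(np.mean((y_true >= lo) & (y_true <= hi)))

# Synthetic regression data, split into a fit set and a calibration set.
n = 200
X = rng.uniform(-2, 2, n)
y = 1.5 * X + rng.normal(0.0, 1.0, n)
fit, cal = slice(0, 120), slice(120, 200)

# Tune alpha: pick the grid value whose 90% BB interval achieves
# empirical coverage closest to the nominal 0.9 on the calibration set.
alphas = [0.5, 1.0, 2.0, 4.0]
cov = {a: coverage(bb_predictive(X[fit], y[fit], X[cal], a), y[cal])
       for a in alphas}
best_alpha = min(alphas, key=lambda a: abs(cov[a] - 0.9))
```

The key design point mirrors the abstract: α controls the spread of the Dirichlet weights (smaller α means more variable reweighting, hence wider predictive draws), so scanning α against empirical coverage is a cheap one-dimensional calibration. The paper replaces the per-draw refit with an influence-function approximation to make this fast for general models.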

Page Count
16 pages

Category
Statistics:
Methodology