Unified Conformalized Multiple Testing with Full Data Efficiency
By: Yuyang Huo, Xiaoyang Wu, Changliang Zou, and others
Potential Business Impact:
Improves decision-making power by using all available data for both scoring and calibration.
Conformalized multiple testing offers a model-free way to control predictive uncertainty in decision-making. Existing methods typically use only part of the available data to build score functions tailored to specific settings. We propose a unified framework that puts data utilization at the center: it uses all available data (null, alternative, and unlabeled) to construct scores and calibrate p-values through a full permutation strategy. This unified use of all available data significantly improves power by enhancing non-conformity score quality and maximizing calibration set size while rigorously controlling the false discovery rate. Crucially, our framework provides a systematic design principle for conformal testing and enables automatic selection of the best conformal procedure among candidates without extra data splitting. Extensive numerical experiments demonstrate that our enhanced methods deliver superior efficiency and adaptability across diverse scenarios.
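To make the abstract's pipeline concrete, here is a minimal sketch of the standard building blocks it refers to: conformal p-values computed from a calibration set of null scores, followed by Benjamini-Hochberg (BH) selection to control the false discovery rate. This illustrates the baseline construction only, not the paper's unified full-permutation method; the score values and function names below are illustrative assumptions.

```python
import numpy as np

def conformal_pvalues(cal_scores, test_scores):
    """Standard conformal p-value: for each test score, the fraction of
    calibration null scores at least as extreme (with a +1 correction
    that guarantees validity under exchangeability)."""
    cal = np.asarray(cal_scores)
    n = cal.size
    return np.array([(1 + np.sum(cal >= s)) / (n + 1) for s in test_scores])

def benjamini_hochberg(pvals, alpha=0.1):
    """BH step-up procedure; returns a boolean rejection mask."""
    p = np.asarray(pvals)
    m = p.size
    order = np.argsort(p)
    thresh = alpha * np.arange(1, m + 1) / m
    below = p[order] <= thresh
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])   # largest index passing the step-up test
        reject[order[:k + 1]] = True
    return reject

# Toy data: null scores ~ N(0,1); 20 of 100 test points are shifted alternatives.
rng = np.random.default_rng(0)
cal = rng.normal(0.0, 1.0, 500)
test = np.concatenate([rng.normal(0.0, 1.0, 80), rng.normal(3.0, 1.0, 20)])

pv = conformal_pvalues(cal, test)
rej = benjamini_hochberg(pv, alpha=0.1)
print(rej.sum(), "discoveries")
```

The paper's contribution, per the abstract, is to improve on this baseline by also using alternative and unlabeled data when building the score function and by calibrating via a full permutation strategy, which enlarges the effective calibration set.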
Similar Papers
Feedback-Enhanced Online Multiple Testing with Applications to Conformal Selection
Methodology
Improves online multiple testing decisions by incorporating feedback.
SConU: Selective Conformal Uncertainty in Large Language Models
Computation and Language
Makes AI predictions more trustworthy and reliable.
Conformal prediction without knowledge of labeled calibration data
Methodology
Lets computers guess answers with a safety net.