Feedback-Enhanced Online Multiple Testing with Applications to Conformal Selection
By: Lin Lu, Yuyang Huo, Haojie Ren and more
Potential Business Impact:
Improves automated sequential decision-making by using feedback on earlier decisions while keeping false alarms under control.
We study online multiple testing with feedback, where decisions are made sequentially and the true state of the hypothesis is revealed after the decision has been made, either instantly or with a delay. We propose GAIF, a feedback-enhanced generalized alpha-investing framework that dynamically adjusts thresholds using revealed outcomes, ensuring finite-sample false discovery rate (FDR)/marginal FDR control. Extending GAIF to online conformal testing, we construct independent conformal $p$-values and introduce a feedback-driven model selection criterion to identify the best model/score, thereby improving statistical power. We demonstrate the effectiveness of our methods through numerical simulations and real-data applications.
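To make the two ingredients described above concrete, below is a minimal Python sketch: a standard split-conformal p-value and a generic generalized alpha-investing loop whose wealth update also consumes revealed outcomes. The spending rule, reward constant, and feedback adjustment are hypothetical placeholders, not the GAIF rules from the paper, and the function names are ours.

```python
import numpy as np

def conformal_p_value(score, calib_scores):
    """Standard split-conformal p-value: rank of the test score among
    calibration scores drawn under the null/reference distribution."""
    n = len(calib_scores)
    return (1 + np.sum(calib_scores >= score)) / (n + 1)

def gai_with_feedback(p_values, feedback, alpha=0.1, w0=0.05, reward=0.45):
    """
    Illustrative online generalized alpha-investing loop with feedback.
    This is NOT the paper's GAIF rule; the spending rule, reward, and
    feedback adjustment are placeholders. `feedback[t]` is the revealed
    truth (1 = non-null) for the t-th hypothesis, assumed available before
    the next test (no delay, for simplicity).
    """
    wealth = w0
    decisions = []
    for t, (p, truth) in enumerate(zip(p_values, feedback), start=1):
        alpha_t = wealth / (2.0 * t)          # spend a shrinking share of wealth
        reject = p <= alpha_t
        decisions.append(reject)
        # GAI-style wealth update: pay alpha_t; earn a reward upon rejection.
        wealth += (reward - alpha_t) if reject else -alpha_t
        # Feedback adjustment (illustrative): a confirmed true discovery
        # replenishes wealth, allowing more aggressive later thresholds.
        if reject and truth == 1:
            wealth += alpha_t
        wealth = max(wealth, 0.0)
    return np.array(decisions)

# Toy usage: 200 mostly-null p-values, with the truth revealed as feedback.
rng = np.random.default_rng(0)
truth = rng.binomial(1, 0.1, size=200)
pvals = np.where(truth == 1, rng.beta(0.1, 1.0, 200), rng.uniform(size=200))
print(gai_with_feedback(pvals, truth).sum(), "rejections")
```

In an actual feedback-enhanced procedure the revealed outcomes would enter the threshold and wealth updates through rules designed to preserve finite-sample FDR/mFDR control; the simple replenishment step above only illustrates where such feedback could plug in.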
Similar Papers
e-GAI: e-value-based Generalized $\alpha$-Investing for Online False Discovery Rate Control
Methodology
Controls false alarms in a stream of tests by using e-values.
Unified Conformalized Multiple Testing with Full Data Efficiency
Methodology
Improves conformal testing decisions by using all of the available data.
Conformalized Multiple Testing under Unknown Null Distribution with Symmetric Errors
Methodology
Finds more true discoveries even when the null distribution is unknown.