Truthful Elicitation of Imprecise Forecasts
By: Anurag Singh, Siu Lun Chau, Krikamol Muandet
Potential Business Impact:
Helps experts share uncertain guesses better.
The quality of probabilistic forecasts is crucial for decision-making under uncertainty. While proper scoring rules incentivize truthful reporting of precise forecasts, they fall short when forecasters face epistemic uncertainty about their beliefs, limiting their use in safety-critical domains where decision-makers (DMs) prioritize proper uncertainty management. To address this, we propose a framework for scoring imprecise forecasts -- forecasts given as a set of beliefs. Despite existing impossibility results for deterministic scoring rules, we enable truthful elicitation by drawing a connection to social choice theory and introducing a two-way communication framework in which DMs first share the aggregation rules (e.g., averaging or min-max) they use in downstream decisions to resolve forecast ambiguity. This, in turn, helps forecasters resolve indecision during elicitation. We further show that truthful elicitation of imprecise forecasts is achievable using proper scoring rules randomized over the aggregation procedure. Our approach allows DMs to elicit and integrate the forecaster's epistemic uncertainty into their decision-making process, thus improving credibility.
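To make the idea concrete, here is a minimal sketch (function names, the credal-set representation, and the choice of the Brier score are our illustrative assumptions, not the paper's construction): a forecaster reports an imprecise forecast as a finite set of probability vectors, the DM announces a menu of aggregation rules, and the score is a strictly proper scoring rule applied to a randomly drawn aggregation of the reported set.

```python
import random

def average_rule(beliefs):
    """Aggregate a credal set by coordinate-wise averaging."""
    n = len(beliefs)
    return [sum(p[i] for p in beliefs) / n for i in range(len(beliefs[0]))]

def minmax_rule(beliefs):
    """Pessimistic aggregation: pick the member minimizing its largest probability."""
    return min(beliefs, key=max)

def brier_score(p, outcome):
    """Negative Brier score (higher is better), a strictly proper scoring rule."""
    return -sum((p[i] - (1.0 if i == outcome else 0.0)) ** 2 for i in range(len(p)))

def randomized_score(beliefs, outcome, rules, rng=random):
    """Score an imprecise forecast via a proper rule randomized over
    the DM's announced aggregation procedures."""
    rule = rng.choice(rules)
    return brier_score(rule(beliefs), outcome)

# A forecaster undecided between two beliefs over a binary outcome:
credal_set = [[0.6, 0.4], [0.8, 0.2]]
score = randomized_score(credal_set, outcome=0,
                         rules=[average_rule, minmax_rule])
```

Because the DM announces the aggregation menu before elicitation, the forecaster's expected score under the randomization is (per the paper's result) maximized by reporting their true set of beliefs, mirroring how proper scoring rules incentivize truthful precise forecasts.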
Similar Papers
Conditional Forecasts and Proper Scoring Rules for Reliable and Accurate Performative Predictions
Statistics Theory
Makes predictions that don't change what happens.
Proper scoring rules for estimation and forecast evaluation
Statistics Theory
Helps computers guess better and learn more.
Stochastically Dominant Peer Prediction
CS and Game Theory
Makes AI learn truth from people better.