Bayesian Prediction under Moment Conditioning
By: Nicholas G. Polson, Daniel Zantedeschi
Potential Business Impact:
Helps systems make trustworthy predictions when only partial information, such as moment constraints, is available.
Prediction is a central task of statistics and machine learning, yet many inferential settings provide only partial information, typically in the form of moment constraints or estimating equations. We develop a finite, fully Bayesian framework for propagating such partial information through predictive distributions. Building on de Finetti's representation theorem, we construct a curvature-adaptive version of exchangeable updating that operates directly under finite constraints, yielding an explicit discrete-Gaussian mixture that quantifies predictive uncertainty. The resulting finite-sample bounds depend on the smallest eigenvalue of the information-geometric Hessian, which measures the curvature and identification strength of the constraint manifold. This approach unifies empirical likelihood, Bayesian empirical likelihood, and generalized method-of-moments estimation within a common predictive geometry. On the operational side, it provides computable curvature-sensitive uncertainty bounds for constrained prediction; on the theoretical side, it recovers de Finetti's coherence, Doob's martingale convergence, and local asymptotic normality as limiting cases of the same finite mechanism. Our framework thus offers a constructive bridge between partial information and full Bayesian prediction.
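To fix ideas, the moment-conditioning step that empirical likelihood and GMM share can be written down explicitly. The following is a standard minimum-relative-entropy sketch under an assumed baseline predictive $p_0$ and moment function $g$; it illustrates the geometry the abstract invokes, not the paper's exact curvature-adaptive construction. Conditioning $p_0$ on the constraint $\mathbb{E}[g(X)] = 0$ yields the exponentially tilted family

\[
p_\lambda(x) \;\propto\; p_0(x)\,\exp\{\lambda^\top g(x)\},
\qquad \text{with } \lambda \text{ solving } \mathbb{E}_{p_\lambda}[g(X)] = 0.
\]

The log-partition function $\psi(\lambda) = \log \int p_0(x)\, e^{\lambda^\top g(x)}\, dx$ is convex with Hessian $\nabla^2 \psi(\lambda) = \operatorname{Cov}_{p_\lambda}[g(X)]$, and the smallest eigenvalue of this Hessian measures how strongly the constraint identifies the tilt, which is the role the abstract assigns to the information-geometric Hessian in its finite-sample bounds. In the discrete (empirical-likelihood) case, $p_0$ is the empirical measure on the observed sample, so the tilt reduces to reweighting the $n$ atoms.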
Similar Papers
De Finetti + Sanov = Bayes
Statistics Theory
Connects de Finetti's exchangeability, Sanov's large-deviation theorem, and Bayesian updating.
From Partial Exchangeability to Predictive Probability: A Bayesian Perspective on Classification
Methodology
Derives predictive classification probabilities from partial-exchangeability assumptions.
The Interplay between Bayesian Inference and Conformal Prediction
Methodology
Examines how Bayesian inference and conformal prediction complement each other for predictive uncertainty.