Score: 1

Location–Scale Calibration for Generalized Posteriors

Published: November 19, 2025 | arXiv ID: 2511.15320v2

By: Shu Tamano, Yui Tomo

Potential Business Impact:

Makes uncertainty estimates from machine learning reliable regardless of how the learning rate is set.

Business Areas:
A/B Testing, Data and Analytics

General Bayesian updating replaces the likelihood with a loss scaled by a learning rate, but posterior uncertainty can depend sharply on that scale. We propose a simple post-processing that aligns generalized posterior draws with their asymptotic target, yielding uncertainty quantification that is invariant to the learning rate. We prove total-variation convergence for generalized posteriors with an effective sample size, allowing sample-size-dependent priors, non-i.i.d. observations, and convex penalties under model misspecification. Within this framework, we justify and extend the open-faced sandwich adjustment (Shaby, 2014), provide general theoretical guarantees for its use within generalized Bayes, and extend it from covariance rescaling to a location–scale calibration whose draws converge in total variation to the target for any learning rate. In our empirical illustration, calibrated draws maintain stable coverage, interval width, and bias over orders of magnitude in the learning rate and closely track frequentist benchmarks, whereas uncalibrated posteriors vary markedly.
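
The abstract describes re-centering and rescaling posterior draws so their distribution matches the asymptotic (sandwich) target. The sketch below is a minimal Python illustration of that general idea, not the paper's exact procedure: it assumes the loss minimizer `theta_hat`, an expected-Hessian estimate `H`, and a score-covariance estimate `J` are available, and maps the draws so their covariance matches the sandwich covariance H⁻¹JH⁻¹.

```python
import numpy as np

def location_scale_calibrate(draws, theta_hat, H, J):
    """Hedged sketch of a location-scale calibration of generalized
    posterior draws, in the spirit of the open-faced sandwich
    adjustment (Shaby, 2014). Names and inputs are illustrative.

    draws     : (S, d) array of generalized posterior samples
    theta_hat : (d,) loss minimizer (location target)
    H         : (d, d) expected Hessian of the loss at theta_hat
    J         : (d, d) covariance of the loss score at theta_hat
    """
    # Empirical covariance of the (learning-rate-dependent) draws.
    sigma_post = np.cov(draws, rowvar=False)

    # Sandwich covariance: the asymptotic target under misspecification.
    H_inv = np.linalg.inv(H)
    sigma_target = H_inv @ J @ H_inv

    def sqrtm_sym(A):
        # Symmetric matrix square root via eigendecomposition.
        w, V = np.linalg.eigh(A)
        return V @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ V.T

    # Linear map C with C @ sigma_post @ C.T == sigma_target.
    C = sqrtm_sym(sigma_target) @ np.linalg.inv(sqrtm_sym(sigma_post))

    # Re-center at theta_hat and rescale: the calibrated draws have
    # mean theta_hat and covariance sigma_target for any learning rate.
    return theta_hat + (draws - draws.mean(axis=0)) @ C.T
```

Because the scale factor of the draws cancels inside C, the output is invariant to the learning rate, which is the property the abstract emphasizes.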

Country of Origin
🇯🇵 Japan

Repos / Data Links

Page Count
18 pages

Category
Statistics: Methodology