Rates of Convergence of Generalised Variational Inference Posteriors under Prior Misspecification

Published: October 3, 2025 | arXiv ID: 2510.03109v1

By: Terje Mildner, Paris Giampouras, Theodoros Damoulas

Potential Business Impact:

Makes AI models learn reliably even when their built-in starting assumptions (priors) are wrong.

Business Areas:
A/B Testing, Data and Analytics

We prove rates of convergence and robustness to prior misspecification within a Generalised Variational Inference (GVI) framework with bounded divergences. This addresses a significant open challenge for GVI and Federated GVI, which employ a divergence other than the Kullback–Leibler under prior misspecification, operate within a subset of the possible probability measures, and result in intractable posteriors. Our theoretical contributions cover severe prior misspecification and rely on restricting the space of possible GVI posterior measures and inferring properties from this restricted space. In particular, we establish sufficient conditions for the existence and uniqueness of GVI posteriors on arbitrary Polish spaces, prove that the GVI posterior measure concentrates on a neighbourhood of the loss minimisers, and extend this to rates of convergence regardless of the prior measure.

Country of Origin
🇬🇧 United Kingdom

Page Count
25 pages

Category
Mathematics: Statistics Theory