Rates of Convergence of Generalised Variational Inference Posteriors under Prior Misspecification
By: Terje Mildner, Paris Giampouras, Theodoros Damoulas
Potential Business Impact:
Makes AI learn better even with wrong starting guesses.
We prove rates of convergence and robustness to prior misspecification within a Generalised Variational Inference (GVI) framework with bounded divergences. This addresses a significant open challenge for GVI and Federated GVI, which employ a divergence other than the Kullback–Leibler under prior misspecification, operate within a subset of possible probability measures, and result in intractable posteriors. Our theoretical contributions cover severe prior misspecification while relying on our ability to restrict the space of possible GVI posterior measures and to infer properties of this restricted space. In particular, we establish sufficient conditions for the existence and uniqueness of GVI posteriors on arbitrary Polish spaces, prove that the GVI posterior measure concentrates on a neighbourhood of the loss minimisers, and extend this to rates of convergence that hold regardless of the prior measure.
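For readers unfamiliar with GVI, the following is a minimal sketch of a generic GVI objective in standard (assumed) notation, not the paper's own: the posterior is chosen as the minimiser, over a restricted family of probability measures, of an expected loss plus a divergence penalty to the prior.

\[
  q^{*} \;=\; \operatorname*{arg\,min}_{q \in \mathcal{Q}}
  \Big\{ \mathbb{E}_{\theta \sim q}\big[\ell_n(\theta)\big] \;+\; D\big(q \,\|\, \pi\big) \Big\}
\]

Here \( \mathcal{Q} \) is the restricted space of candidate posterior measures, \( \ell_n \) the empirical loss over \( n \) observations, \( \pi \) the (possibly misspecified) prior, and \( D \) the chosen divergence; the paper's results concern bounded choices of \( D \) in place of the Kullback–Leibler divergence, which is what standard variational inference would use.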
Similar Papers
Geometric Convergence Analysis of Variational Inference via Bregman Divergences
Machine Learning (Stat)
Helps computers learn better by understanding math.
Variational Inference with Mixtures of Isotropic Gaussians
Machine Learning (Stat)
Finds better computer guesses for complex problems.
Generalized Guarantees for Variational Inference in the Presence of Even and Elliptical Symmetry
Machine Learning (Stat)
Makes computer guesses about data more accurate.