Adaptive Heterogeneous Mixtures of Normalising Flows for Robust Variational Inference

Published: October 2, 2025 | arXiv ID: 2510.02056v1

By: Benjamin Wiriyapong, Oktay Karakuş, Kirill Sidorov

Potential Business Impact:

Helps computers make better statistical guesses about complicated, oddly shaped data distributions.

Business Areas:
A/B Testing, Data and Analytics

Normalising-flow variational inference (VI) can approximate complex posteriors, yet single-flow models often behave inconsistently across qualitatively different distributions. We propose Adaptive Mixture Flow Variational Inference (AMF-VI), a heterogeneous mixture of complementary flows (MAF, RealNVP, RBIG) trained in two stages: (i) sequential expert training of the individual flows, and (ii) adaptive global weight estimation via likelihood-driven updates, without per-sample gating or architectural changes. Evaluated on six canonical posterior families (banana, X-shape, two-moons, rings, a bimodal mixture, and a five-mode mixture), AMF-VI achieves consistently lower negative log-likelihood than each single-flow baseline and delivers stable gains in the Wasserstein-2 transport metric and maximum mean discrepancy (MMD), indicating improved robustness across shapes and modalities. The procedure is efficient and architecture-agnostic, incurring minimal overhead relative to standard flow training, and demonstrates that adaptive mixtures of diverse flows provide a reliable route to robust VI across diverse posterior families whilst preserving each expert's inductive bias.
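The stage-(ii) weight update lends itself to a short illustration. The sketch below is one plausible reading of "adaptive global weight estimation via likelihood-driven updates": an EM-style re-estimation of fixed global mixture weights over frozen experts. The expert objects, their `log_prob` interface, and the iteration count are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

def update_global_weights(experts, x, weights, n_iters=50):
    """Likelihood-driven global weight estimation (a minimal sketch).

    `experts`: trained flow models, each assumed to expose a
    `log_prob(x) -> np.ndarray` method (hypothetical interface; the paper's
    experts are MAF, RealNVP, and RBIG).
    `x`: a batch of samples from the target distribution.
    `weights`: initial global mixture weights, one per expert.
    """
    # Per-expert log-likelihoods, shape (n_experts, n_samples); computed once,
    # since the experts are frozen after stage (i) of training.
    log_p = np.stack([e.log_prob(x) for e in experts])

    w = np.asarray(weights, dtype=float)
    for _ in range(n_iters):
        # E-step-style responsibilities: posterior over experts per sample.
        log_r = np.log(w)[:, None] + log_p
        log_r -= log_r.max(axis=0, keepdims=True)   # numerical stability
        resp = np.exp(log_r)
        resp /= resp.sum(axis=0, keepdims=True)
        # M-step-style update: new global weights are average responsibilities.
        w = resp.mean(axis=1)
    return w
```

With frozen experts this reduces to the classical EM update for mixture weights alone, so each iteration is cheap (the expert log-likelihoods are evaluated once) and does not decrease the mixture log-likelihood of the batch, consistent with the abstract's claim of minimal overhead relative to standard flow training.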

Country of Origin
🇬🇧 United Kingdom

Page Count
12 pages

Category
Computer Science:
Machine Learning (CS)