Adaptive Heterogeneous Mixtures of Normalising Flows for Robust Variational Inference
By: Benjamin Wiriyapong, Oktay Karakuş, Kirill Sidorov
Potential Business Impact:
Makes computer guesses better for tricky shapes.
Normalising-flow variational inference (VI) can approximate complex posteriors, yet single-flow models often behave inconsistently across qualitatively different distributions. We propose Adaptive Mixture Flow Variational Inference (AMF-VI), a heterogeneous mixture of complementary flows (MAF, RealNVP, RBIG) trained in two stages: (i) sequential expert training of the individual flows, and (ii) adaptive global weight estimation via likelihood-driven updates, without per-sample gating or architectural changes. Evaluated on six canonical posterior families (banana, X-shape, two-moons, rings, a bimodal mixture, and a five-mode mixture), AMF-VI achieves consistently lower negative log-likelihood than each single-flow baseline and delivers stable gains in the transport metric (Wasserstein-2) and maximum mean discrepancy (MMD), indicating improved robustness across shapes and modalities. The procedure is efficient and architecture-agnostic, incurring minimal overhead relative to standard flow training, and demonstrates that adaptive mixtures of diverse flows provide a reliable route to robust VI across varied posterior families whilst preserving each expert's inductive bias.
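To make stage (ii) concrete, below is a minimal sketch of the likelihood-driven global weight update described in the abstract: one weight per expert, no per-sample gating, estimated as an EM-style fixed point on the mixture log-likelihood. The interface is assumed for illustration only: `experts` is a list of already-trained flow objects (e.g. MAF, RealNVP, RBIG) each exposing a hypothetical `log_prob(samples)` method, and `estimate_global_weights` is not a function from the paper.

```python
import numpy as np
from scipy.special import logsumexp

def estimate_global_weights(experts, samples, n_iters=100, tol=1e-8):
    """Sketch of adaptive global weight estimation for a mixture of
    pre-trained flow experts (assumed interface: expert.log_prob(x)
    returns per-sample log-densities).

    Fixed-point (EM-style) updates on the mixture log-likelihood
        log p(x) = logsumexp_k [ log w_k + log p_k(x) ],
    with a single global weight per expert rather than a per-sample gate."""
    K = len(experts)
    # Per-expert log-densities on the evaluation samples, shape (K, N).
    log_p = np.stack([ex.log_prob(samples) for ex in experts])
    N = log_p.shape[1]
    log_w = np.full(K, -np.log(K))  # start from uniform weights

    for _ in range(n_iters):
        # Responsibilities r_{k,n} = w_k p_k(x_n) / sum_j w_j p_j(x_n), in log-space.
        log_r = log_w[:, None] + log_p
        log_r -= logsumexp(log_r, axis=0, keepdims=True)
        # New weight for expert k is its mean responsibility over the samples.
        new_log_w = logsumexp(log_r, axis=1) - np.log(N)
        if np.max(np.abs(np.exp(new_log_w) - np.exp(log_w))) < tol:
            log_w = new_log_w
            break
        log_w = new_log_w

    return np.exp(log_w)
```

Because the experts are frozen after stage (i), this update only re-weights their densities, which is why the overhead over standard flow training stays small and each expert's inductive bias is preserved.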
Similar Papers
Amortized variational transdimensional inference
Computation
Lets computers learn from changing amounts of information.
Amortized Inference of Multi-Modal Posteriors using Likelihood-Weighted Normalizing Flows
Machine Learning (CS)
Finds hidden patterns in complex data faster.
Improved Mean Flows: On the Challenges of Fastforward Generative Models
CV and Pattern Recognition
Makes AI create pictures faster and better.