Amortized Inference of Multi-Modal Posteriors using Likelihood-Weighted Normalizing Flows
By: Rajneil Baruah
Potential Business Impact:
Finds hidden patterns in complex data faster.
We present a novel technique for amortized posterior estimation using Normalizing Flows trained with likelihood-weighted importance sampling. This approach allows for the efficient inference of theoretical parameters in high-dimensional inverse problems without requiring samples from the posterior during training. We evaluate the method on multi-modal benchmark tasks in 2D and 3D to assess its efficacy. A critical observation of our study is the impact of the topology of the base distribution on the modelled posterior. We find that standard unimodal base distributions fail to capture disconnected support, resulting in spurious probability bridges between modes. We demonstrate that initializing the flow with a Gaussian Mixture Model base distribution that matches the cardinality of the target modes significantly improves reconstruction fidelity, as measured by standard distance and divergence metrics.
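The core idea of likelihood-weighted training, as described in the abstract, can be sketched in a minimal toy example: draw parameters from the prior, weight them by the likelihood of an observation, and fit a density model by weighted maximum likelihood. The sketch below is illustrative only; it uses a hypothetical 1D conjugate-Gaussian setup (all values are assumptions, not from the paper) and replaces the normalizing flow with a single Gaussian whose weighted fit has a closed form, so the estimate can be checked against the analytic posterior.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical conjugate-Gaussian setup (illustrative, not from the paper):
# prior theta ~ N(0, 3^2), likelihood x | theta ~ N(theta, 0.5^2).
prior_std, lik_std, x_obs = 3.0, 0.5, 2.0

# Step 1: sample parameters from the prior -- no posterior samples needed.
theta = rng.normal(0.0, prior_std, size=200_000)

# Step 2: self-normalized importance weights proportional to the likelihood.
log_w = -0.5 * ((x_obs - theta) / lik_std) ** 2
w = np.exp(log_w - log_w.max())
w /= w.sum()

# Step 3: fit the density model by weighted maximum likelihood.
# Here the "flow" is a single Gaussian with a closed-form weighted fit;
# a real normalizing flow would minimize the same weighted negative
# log-likelihood by stochastic gradient descent.
post_mean = np.sum(w * theta)
post_var = np.sum(w * (theta - post_mean) ** 2)

# Analytic conjugate posterior for comparison.
true_var = 1.0 / (1.0 / prior_std**2 + 1.0 / lik_std**2)
true_mean = true_var * x_obs / lik_std**2

print(post_mean, true_mean)  # both close to 1.95
print(post_var, true_var)    # both close to 0.24
```

For a multi-modal target, step 3 is where a unimodal Gaussian (or a flow with a unimodal base) would fail in the way the abstract describes, which motivates the mixture-model base distribution.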
Similar Papers
Adaptive Heterogeneous Mixtures of Normalising Flows for Robust Variational Inference
Machine Learning (CS)
Makes computer guesses better for tricky shapes.
Amortized Sampling with Transferable Normalizing Flows
Machine Learning (CS)
Teaches computers to predict how molecules will move.
Provable Mixed-Noise Learning with Flow-Matching
Machine Learning (CS)
Fixes messy data from science experiments.