Sequential Monte Carlo approximations of Wasserstein--Fisher--Rao gradient flows
By: Francesca R. Crucinio, Sahani Pathiraja
Potential Business Impact:
Improves how computers draw representative samples from complicated probability distributions, a core step in Bayesian inference and machine learning.
We consider the problem of sampling from a probability distribution $\pi$. It is well known that this can be written as an optimisation problem over the space of probability distributions in which we aim to minimise the Kullback--Leibler divergence from $\pi$. We consider several partial differential equations (PDEs) whose solution is a minimiser of the Kullback--Leibler divergence from $\pi$ and connect them to well-known Monte Carlo algorithms. We focus in particular on PDEs obtained by considering the Wasserstein--Fisher--Rao geometry over the space of probability measures and show that these lead to a natural implementation using importance sampling and sequential Monte Carlo. We propose a novel algorithm to approximate the Wasserstein--Fisher--Rao flow of the Kullback--Leibler divergence which empirically outperforms the current state of the art. We study tempered versions of these PDEs, obtained by replacing the target distribution with a geometric mixture of the initial and target distributions, and show that these do not lead to a speed-up in convergence.
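The particle structure behind this connection can be sketched in a few lines. The Wasserstein--Fisher--Rao gradient flow of $\mathrm{KL}(\rho_t \,\|\, \pi)$ combines a transport term that moves mass along $-\nabla \log(\rho_t/\pi)$ with a reaction term that reweights mass by $-(\log(\rho_t/\pi) - \mathbb{E}_{\rho_t}[\log(\rho_t/\pi)])$, so a generic particle discretisation alternates a Langevin move with an importance-sampling reweight-and-resample step. The sketch below, on a toy one-dimensional Gaussian-mixture target, is a minimal illustration of that generic scheme, not the paper's proposed algorithm; the target, the kernel density estimate of $\log \rho_t$, and all step sizes are illustrative assumptions.

```python
import numpy as np

def log_pi(x):
    # Toy target (an assumption): equal mixture of N(-2, 1) and N(2, 1),
    # up to an additive constant, which the self-normalised weights absorb.
    return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

def grad_log_pi(x):
    a = np.exp(-0.5 * (x - 2.0) ** 2)
    b = np.exp(-0.5 * (x + 2.0) ** 2)
    return (-(x - 2.0) * a - (x + 2.0) * b) / (a + b)

def kde_log_density(x, bandwidth=0.3):
    # Crude Gaussian kernel density estimate of log rho_t at the particles;
    # any density estimator could be substituted here.
    diffs = (x[:, None] - x[None, :]) / bandwidth
    dens = np.mean(np.exp(-0.5 * diffs ** 2), axis=1) / (bandwidth * np.sqrt(2 * np.pi))
    return np.log(dens)

def wfr_particle_flow(n=2000, steps=300, h=0.05, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, size=n)  # initial particles from N(0, 1)
    for _ in range(steps):
        # Wasserstein part: one unadjusted Langevin step towards pi.
        x = x + h * grad_log_pi(x) + np.sqrt(2.0 * h) * rng.normal(size=n)
        # Fisher--Rao part: importance weights exp(-h * log(rho_t / pi));
        # self-normalisation absorbs the mean-centering of the reaction term.
        logw = -h * (kde_log_density(x) - log_pi(x))
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # SMC-style multinomial resampling stands in for explicit birth-death.
        x = rng.choice(x, size=n, replace=True, p=w)
    return x

samples = wfr_particle_flow()
print(samples.mean(), samples.std())  # roughly 0 and sqrt(5) for this target
```

Dropping the resampling step recovers plain unadjusted Langevin dynamics (the pure Wasserstein flow), while dropping the Langevin step leaves a pure reweighting dynamic (the Fisher--Rao flow); the interplay of the two is what the paper's PDE analysis makes precise.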
Similar Papers
Gradient Flow Sampler-based Distributionally Robust Optimization
Optimization and Control
Finds best solutions by checking worst-case scenarios.
An operator splitting analysis of Wasserstein--Fisher--Rao gradient flows
Machine Learning (Stat)
Makes sampling algorithms faster by splitting one update into two simpler steps.