Score: 2

Sequential Monte Carlo approximations of Wasserstein--Fisher--Rao gradient flows

Published: June 6, 2025 | arXiv ID: 2506.05905v1

By: Francesca R. Crucinio, Sahani Pathiraja

Potential Business Impact:

Provides faster, more reliable methods for drawing samples from complex probability distributions, a core task in Bayesian inference and machine learning.

Business Areas:
A/B Testing, Data and Analytics

We consider the problem of sampling from a probability distribution $\pi$. It is well known that this can be written as an optimisation problem over the space of probability distributions in which we aim to minimise the Kullback--Leibler divergence from $\pi$. We consider several partial differential equations (PDEs) whose solution is a minimiser of the Kullback--Leibler divergence from $\pi$ and connect them to well-known Monte Carlo algorithms. We focus in particular on PDEs obtained by considering the Wasserstein--Fisher--Rao geometry over the space of probabilities and show that these lead to a natural implementation using importance sampling and sequential Monte Carlo. We propose a novel algorithm to approximate the Wasserstein--Fisher--Rao flow of the Kullback--Leibler divergence which empirically outperforms the current state-of-the-art. We study tempered versions of these PDEs, obtained by replacing the target distribution with a geometric mixture of the initial and target distributions, and show that these do not lead to a convergence speed-up.
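To make the importance sampling / sequential Monte Carlo connection concrete, here is a minimal sketch of a generic tempered SMC sampler that moves particles through the geometric mixture $\pi_t \propto \pi_0^{1-\beta_t}\,\pi^{\beta_t}$, alternating importance reweighting with resampling (the mass-transport, Fisher--Rao-like part) and Metropolis moves (the spatial, Wasserstein-like part). This is a standard illustrative construction, not the authors' novel algorithm; the function name `tempered_smc` and all parameters are hypothetical.

```python
import numpy as np

def tempered_smc(log_init, log_target, x0, betas, n_mcmc=5, step=0.5, rng=None):
    """Generic tempered SMC sketch (hypothetical helper, 1-D for simplicity).

    Moves a particle cloud from pi_0 to pi through the geometric bridge
    pi_t proportional to pi_0^(1 - beta_t) * pi^beta_t.
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float).copy()
    n = x.shape[0]
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Importance reweighting for the geometric bridge pi_{t} -> pi_{t+1}
        logw = (b - b_prev) * (log_target(x) - log_init(x))
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Resampling: duplicate high-weight particles, kill low-weight ones
        x = x[rng.choice(n, size=n, p=w)]

        def log_pi_b(y):
            # Unnormalised log-density of the current tempered target
            return (1.0 - b) * log_init(y) + b * log_target(y)

        # Random-walk Metropolis moves to rejuvenate the particle cloud
        for _ in range(n_mcmc):
            prop = x + step * rng.normal(size=n)
            accept = np.log(rng.uniform(size=n)) < log_pi_b(prop) - log_pi_b(x)
            x = np.where(accept, prop, x)
    return x

# Usage: move a broad N(0, 3^2) cloud onto a N(0, 1) target
log_init = lambda y: -0.5 * (y / 3.0) ** 2
log_target = lambda y: -0.5 * y ** 2
rng = np.random.default_rng(0)
x0 = 3.0 * rng.normal(size=2000)
samples = tempered_smc(log_init, log_target, x0, np.linspace(0.0, 1.0, 21), rng=1)
```

The resampling step is the particle-level analogue of the reaction (Fisher--Rao) term that reallocates probability mass, while the MCMC moves play the role of the transport (Wasserstein) term; the paper's finding is that tempering of this kind does not by itself accelerate convergence of the underlying flow.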

Country of Origin
🇮🇹 🇦🇺 Italy, Australia

Page Count
41 pages

Category
Statistics: Methodology