Score: 1

An operator splitting analysis of Wasserstein-Fisher-Rao gradient flows

Published: November 22, 2025 | arXiv ID: 2511.18060v1

By: Francesca Romana Crucinio, Sahani Pathiraja

Potential Business Impact:

Speeds up sampling for machine learning by combining two complementary mathematical methods.

Business Areas:
A/B Testing Data and Analytics

Wasserstein-Fisher-Rao (WFR) gradient flows have recently been proposed as a powerful sampling tool that combines the advantages of pure Wasserstein (W) and pure Fisher-Rao (FR) gradient flows. Existing algorithmic developments implicitly make use of operator splitting techniques to numerically approximate the WFR partial differential equation, whereby the W flow is evaluated over a given step size followed by the FR flow (or vice versa). This work investigates the impact of the order in which the W and FR operators are evaluated and aims to provide a quantitative analysis. Somewhat surprisingly, we show that with a judicious choice of step size and operator ordering, the split scheme can converge to the target distribution faster than the exact WFR flow (in terms of model time). We obtain variational formulae describing the evolution over one time step of both sequential splitting schemes and investigate in which settings the W-FR split should be preferred to the FR-W split. As a step towards this goal, we show that the WFR gradient flow preserves log-concavity and obtain the first sharp decay bound for WFR.
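To make the sequential splitting concrete, below is a minimal particle-based sketch of the two orderings described in the abstract, assuming an unadjusted Langevin (ULA) discretisation for the W sub-flow and a simple weight-tilting plus resampling step for the FR sub-flow. The 1-D Gaussian target, step size, and helper names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def log_target(x):
    # Illustrative 1-D standard Gaussian target (assumption, not from the paper).
    return -0.5 * x**2

def grad_log_target(x):
    return -x

def w_step(x, h, rng):
    # Wasserstein part: one unadjusted Langevin (ULA) step of size h.
    return x + h * grad_log_target(x) + np.sqrt(2.0 * h) * rng.standard_normal(x.shape)

def fr_step(x, log_w, h):
    # Fisher-Rao part: tilt particle weights towards high target density over a
    # step of size h. This is a crude stand-in for the exact FR flow, which would
    # also involve the current density estimate.
    lt = log_target(x)
    log_w = log_w + h * (lt - np.mean(lt))
    return log_w - np.logaddexp.reduce(log_w)  # renormalise in log space

def resample(x, log_w, rng):
    # Multinomial resampling returns an equally weighted particle cloud.
    idx = rng.choice(len(x), size=len(x), p=np.exp(log_w))
    return x[idx], np.full(len(x), -np.log(len(x)))

def split_wfr(x0, n_steps, h, order="W-FR", seed=0):
    # Sequential splitting: apply the W and FR sub-flows in the chosen order.
    rng = np.random.default_rng(seed)
    x = x0.copy()
    log_w = np.full(len(x), -np.log(len(x)))
    for _ in range(n_steps):
        if order == "W-FR":
            x = w_step(x, h, rng)
            log_w = fr_step(x, log_w, h)
        else:  # "FR-W"
            log_w = fr_step(x, log_w, h)
            x = w_step(x, h, rng)
        x, log_w = resample(x, log_w, rng)
    return x

# Compare the two orderings from the same initial particle cloud.
x0 = np.random.default_rng(1).normal(5.0, 1.0, size=1000)
samples_wfr = split_wfr(x0, n_steps=200, h=0.05, order="W-FR")
samples_frw = split_wfr(x0, n_steps=200, h=0.05, order="FR-W")
```

The `order` flag switches between the W-FR and FR-W splits whose relative merits the paper analyses; which ordering wins, and at what step size, is exactly the question studied there.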

Country of Origin
🇦🇺 🇮🇹 Australia, Italy

Page Count
39 pages

Category
Statistics:
Machine Learning (stat.ML)