Computing Wasserstein Barycenters through Gradient Flows
By: Eduardo Fernandes Montesuma, Yassir Bendou, Mike Gartrell
Potential Business Impact:
Averages probability distributions faster and at larger scale than existing methods.
Wasserstein barycenters provide a powerful tool for aggregating probability measures while leveraging the geometry of their ambient space. Existing discrete methods suffer from poor scalability, as they require access to the complete set of samples from input measures. We address this issue by recasting the original barycenter problem as a gradient flow in the Wasserstein space. Our approach offers two advantages. First, we achieve scalability by sampling mini-batches from the input measures. Second, we incorporate functionals over probability measures, which regularize the barycenter problem through internal, potential, and interaction energies. We present two algorithms for empirical and Gaussian mixture measures, providing convergence guarantees under the Polyak-Łojasiewicz inequality. Experimental validation on toy datasets and domain adaptation benchmarks shows that our methods outperform previous discrete and neural net-based methods for computing Wasserstein barycenters.
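The abstract's core idea, evolving barycenter particles by a mini-batch gradient flow over the input measures, can be illustrated with a minimal sketch. The code below is not the paper's algorithm: it is a generic free-support particle scheme, assumed for illustration, that couples the barycenter particles to a random mini-batch of each input measure with a plain entropic Sinkhorn solver, then takes an explicit Euler step toward the weighted average of the barycentric projections. All function names (`sinkhorn_plan`, `barycenter_flow`) and parameter choices are hypothetical.

```python
import numpy as np

def sinkhorn_plan(X, Y, eps=0.1, n_iter=200):
    """Entropic OT coupling between uniform empirical measures X (n,d) and Y (m,d).

    A basic Sinkhorn iteration on a cost matrix normalized to [0, 1];
    eps is the entropic regularization strength (an assumed default).
    """
    n, m = len(X), len(Y)
    C = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)  # squared Euclidean cost
    C = C / C.max()                                      # normalize for stability
    K = np.exp(-C / eps)
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)      # uniform marginals
    v = np.ones(m) / m
    for _ in range(n_iter):
        u = a / (K @ v)
        v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]                   # coupling matrix

def barycenter_flow(measures, n_particles=64, weights=None, lr=0.5,
                    n_steps=100, batch_size=32, eps=0.1, seed=0):
    """Mini-batch gradient-flow sketch for a free-support Wasserstein barycenter.

    measures: list of (n_k, d) arrays of samples from the input measures.
    Returns an (n_particles, d) array of barycenter particle positions.
    """
    rng = np.random.default_rng(seed)
    K = len(measures)
    w = np.full(K, 1.0 / K) if weights is None else np.asarray(weights)
    d = measures[0].shape[1]
    X = rng.standard_normal((n_particles, d))            # initial particles
    for _ in range(n_steps):
        drift = np.zeros_like(X)
        for k, Yk in enumerate(measures):
            # Scalability comes from coupling X to a mini-batch, not all of Yk.
            batch = Yk[rng.choice(len(Yk), size=batch_size)]
            P = sinkhorn_plan(X, batch, eps=eps)
            T = (P @ batch) / P.sum(axis=1, keepdims=True)  # barycentric projection
            drift += w[k] * (T - X)
        X += lr * drift                                  # explicit Euler step
    return X
```

As a sanity check, running the flow on samples from two well-separated Gaussians should place the barycenter particles near the midpoint of the two clusters. Note this sketch omits the paper's regularizing energy functionals and its Gaussian-mixture variant.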
Similar Papers
A Particle-Flow Algorithm for Free-Support Wasserstein Barycenters
Machine Learning (Stat)
Finds the average of complex data patterns.
Differentially Private Wasserstein Barycenters
Machine Learning (CS)
Keeps private data safe when finding averages.