Stochastic Optimization in Semi-Discrete Optimal Transport: Convergence Analysis and Minimax Rate
By: Ferdinand Genans, Antoine Godichon-Baggioni, François-Xavier Vialard, and more
Potential Business Impact:
Proves that a popular, fast method for moving data between distributions reliably works.
We investigate the semi-discrete Optimal Transport (OT) problem, where a continuous source measure $\mu$ is transported to a discrete target measure $\nu$, with particular attention to the OT map approximation. In this setting, Stochastic Gradient Descent (SGD) based solvers have demonstrated strong empirical performance in recent machine learning applications, yet theoretical guarantees that they approximate the OT map have remained an open question. In this work, we answer this question positively by providing both computational and statistical convergence guarantees for SGD. Specifically, we show that SGD methods can estimate the OT map with a minimax convergence rate of $\mathcal{O}(1/\sqrt{n})$, where $n$ is the number of samples drawn from $\mu$. To establish this result, we study the averaged projected SGD algorithm and identify a suitable projection set that contains a minimizer of the objective, even when the source measure is not compactly supported. Our analysis holds under mild assumptions on the source measure and applies to MTW (Ma-Trudinger-Wang) cost functions, which include $\|\cdot\|^p$ for $p \in (1, \infty)$. Finally, we provide numerical evidence for our theoretical results.
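To make the setting concrete, below is a minimal sketch of averaged projected SGD on the semi-dual formulation of semi-discrete OT, assuming the squared Euclidean cost. The function names (`averaged_projected_sgd`, `sq_cost`, `sample_mu`), the step-size schedule $\gamma_0/\sqrt{t}$, and the zero-mean projection are illustrative assumptions, not the paper's exact construction; in particular, the paper identifies a more careful projection set containing a minimizer even when $\mu$ is not compactly supported.

```python
import numpy as np

rng = np.random.default_rng(0)

def sq_cost(x, Y):
    # c(x, y_j) = ||x - y_j||^2 for every support point y_j of nu
    return np.sum((x - Y) ** 2, axis=-1)

def averaged_projected_sgd(sample_mu, Y, nu, n_iters, gamma0=1.0):
    """Averaged projected SGD on the semi-dual objective
    F(v) = E_x[ min_j c(x, y_j) - v_j ] + sum_j nu_j v_j  (concave in v)."""
    m = len(nu)
    v = np.zeros(m)      # dual potentials on the support of nu
    v_bar = np.zeros(m)  # Polyak-Ruppert average of the iterates
    for t in range(1, n_iters + 1):
        x = sample_mu()
        # Laguerre cell of x: index minimizing c(x, y_j) - v_j
        j = np.argmin(sq_cost(x, Y) - v)
        g = nu.copy()
        g[j] -= 1.0                          # stochastic supergradient of F at v
        v = v + gamma0 / np.sqrt(t) * g      # ascent step
        v -= v.mean()                        # project onto {sum_j v_j = 0}:
                                             # potentials are defined up to a constant
        v_bar += (v - v_bar) / t             # running average of iterates
    return v_bar

# Usage (hypothetical setup): 10 target points in R^2 with uniform
# weights, standard Gaussian source measure
Y = rng.normal(size=(10, 2))
nu = np.full(10, 0.1)
v_hat = averaged_projected_sgd(lambda: rng.normal(size=2), Y, nu, n_iters=5000)

# Estimated OT map: send x to the target point of its Laguerre cell
T = lambda x: Y[np.argmin(sq_cost(x, Y) - v_hat)]
```

The averaging step is what delivers the $\mathcal{O}(1/\sqrt{n})$ rate in this family of methods: the raw iterates `v` fluctuate, while the averaged `v_bar` concentrates around a maximizer of the semi-dual.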
Similar Papers
Sharp Convergence Rates of Empirical Unbalanced Optimal Transport for Spatio-Temporal Point Processes
Statistics Theory
Measures how well data points match patterns.
A Truncated Newton Method for Optimal Transport
Machine Learning (CS)
Makes computers solve hard math problems faster.
Convex relaxation approaches for high-dimensional optimal transport
Optimization and Control
Makes complex math problems easier for computers.