General Divergence Regularized Optimal Transport: Sample Complexity and Central Limit Theorems
By: Jiaping Yang, Yunxin Zhang
Potential Business Impact:
Gives statistical guarantees that make regularized optimal transport practical on high-dimensional data.
Optimal transport has emerged as a fundamental methodology with applications spanning multiple research areas in recent years. However, the convergence rate of the empirical estimator to its population counterpart suffers from the curse of dimensionality, which prevents its application in high-dimensional spaces. While entropic regularization has been proven to effectively mitigate the curse of dimensionality and achieve a parametric convergence rate under mild conditions, these statistical guarantees have not been extended to general regularizers. Our work bridges this gap by establishing analogous results for a broader family of regularizers. Specifically, under boundedness constraints, we prove a convergence rate of order $n^{-1/2}$ with respect to the sample size $n$. Furthermore, we derive several central limit theorems for divergence regularized optimal transport.
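For concreteness, a common way to write the divergence regularized problem (the abstract does not fix notation, so this formulation is an assumption) is $\mathrm{OT}_{\varepsilon,\phi}(\mu,\nu) = \inf_{\pi \in \Pi(\mu,\nu)} \int c \, d\pi + \varepsilon\, D_\phi(\pi \,\|\, \mu \otimes \nu)$, where $D_\phi$ is an $f$-divergence and the choice $\phi(t) = t\log t - t + 1$ (Kullback-Leibler) recovers entropic optimal transport. The minimal sketch below estimates this entropic special case from samples with a plain Sinkhorn loop; the function name entropic_ot, the squared-Euclidean cost, and the parameter values are illustrative assumptions, not taken from the paper.

import numpy as np

def entropic_ot(x, y, eps=1.0, n_iter=500):
    # Sinkhorn estimate of the entropic OT cost between the empirical
    # measures of samples x (n, d) and y (m, d), squared-Euclidean cost.
    n, m = x.shape[0], y.shape[0]
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)      # uniform sample weights
    C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)   # pairwise cost matrix
    K = np.exp(-C / eps)                                  # Gibbs kernel
    u, v = np.ones(n), np.ones(m)
    for _ in range(n_iter):                               # Sinkhorn fixed-point updates
        u = a / (K @ v)
        v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]                       # approximate optimal coupling
    return float((P * C).sum())                           # transport-cost part of the objective

# Illustrative usage on Gaussian toy data at two sample sizes.
rng = np.random.default_rng(0)
for n in (100, 400):
    x = rng.normal(size=(n, 3))
    y = rng.normal(loc=0.5, size=(n, 3))
    print(n, entropic_ot(x, y))

This bare loop is only for illustration; for small regularization parameters a stabilized (log-domain) implementation, such as the one provided by the POT library's ot.sinkhorn2, would typically be used instead.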
Similar Papers
Sparse Regularized Optimal Transport without Curse of Dimensionality
Statistics Theory
Shows that sparse regularization can also avoid the curse of dimensionality.
A Truncated Newton Method for Optimal Transport
Machine Learning (CS)
A faster numerical method for solving optimal transport problems.
Sample complexity for entropic optimal transport with radial cost
Statistics Theory
Sample-size guarantees for entropic optimal transport with radial cost functions.