General Divergence Regularized Optimal Transport: Sample Complexity and Central Limit Theorems

Published: October 2, 2025 | arXiv ID: 2510.02489v1

By: Jiaping Yang, Yunxin Zhang

Potential Business Impact:

Extends the statistical guarantees of regularized optimal transport beyond the entropic case, making OT-based methods more reliable in high-dimensional settings.

Business Areas:
Last Mile Transportation; Transportation

Optimal transport has emerged as a fundamental methodology with applications spanning multiple research areas in recent years. However, the convergence rate of the empirical estimator to its population counterpart suffers from the curse of dimensionality, which prevents its application in high-dimensional spaces. While entropic regularization has been proven to effectively mitigate the curse of dimensionality and achieve a parametric convergence rate under mild conditions, these statistical guarantees have not been extended to general regularizers. Our work bridges this gap by establishing analogous results for a broader family of regularizers. Specifically, under boundedness constraints, we prove a convergence rate of order $n^{-1/2}$ with respect to sample size $n$. Furthermore, we derive several central limit theorems for divergence regularized optimal transport.
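The abstract's baseline, entropic regularization, can be illustrated concretely. The sketch below (a minimal hypothetical implementation using standard Sinkhorn scaling iterations, not the authors' code; sample sizes and the regularization strength `eps` are arbitrary choices) computes a divergence regularized OT plan between two empirical measures, the kind of estimator whose $n^{-1/2}$ convergence the paper analyzes:

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.5, n_iter=500):
    """Entropic-regularized OT between discrete measures a and b
    with cost matrix C; returns the transport plan P."""
    K = np.exp(-C / eps)            # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)           # alternating marginal scalings
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]

# Empirical measures: n i.i.d. samples from each distribution
rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=(n, 2))
y = rng.normal(size=(n, 2)) + 1.0   # shifted target distribution
C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)  # squared-Euclidean cost
a = np.full(n, 1.0 / n)
b = np.full(n, 1.0 / n)

P = sinkhorn(a, b, C)
cost = (P * C).sum()                # empirical regularized OT cost
```

The paper's contribution is to show that such plug-in estimators retain the parametric $n^{-1/2}$ rate (and satisfy central limit theorems) when the entropy term is replaced by a general divergence regularizer, under boundedness constraints.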

Country of Origin
🇨🇳 China

Page Count
21 pages

Category
Mathematics:
Statistics Theory