Poisson Midpoint Method for Log Concave Sampling: Beyond the Strong Error Lower Bounds
By: Rishikesh Srinivasan, Dheeraj Nagaraj
Potential Business Impact:
Makes computer sampling from complex distributions much faster: the cost grows cubically more slowly as the required accuracy increases.
We study the problem of sampling from strongly log-concave distributions over $\mathbb{R}^d$ using the Poisson midpoint discretization (a variant of the randomized midpoint method) for overdamped/underdamped Langevin dynamics. We prove its convergence in the 2-Wasserstein distance ($W_2$), achieving a cubic speedup in the dependence on the target accuracy ($\epsilon$) over the Euler-Maruyama discretization, surpassing existing bounds for randomized midpoint methods. Notably, in the case of underdamped Langevin dynamics, we demonstrate that the complexity of $W_2$ convergence is much smaller than the complexity lower bounds for convergence in $L^2$ strong error established in the literature.
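To make the abstract concrete, here is a minimal sketch of the classical randomized midpoint discretization of overdamped Langevin dynamics (the method of which the paper's Poisson midpoint scheme is a variant; this is not the paper's own algorithm). It assumes a strongly log-concave target $\pi \propto e^{-f}$ with gradient oracle `grad_f`; each step evaluates the drift at a uniformly random intermediate time, with the two Brownian increments coupled so they sum to the increment over the full step.

```python
import numpy as np

def randomized_midpoint_langevin(grad_f, x0, h, n_steps, rng):
    """One chain of the randomized midpoint discretization of
    overdamped Langevin dynamics dX_t = -grad f(X_t) dt + sqrt(2) dB_t.

    This sketches the Shen-Lee randomized midpoint method, not the
    Poisson midpoint variant analyzed in the paper.
    """
    x = np.array(x0, dtype=float)
    d = x.shape[0]
    for _ in range(n_steps):
        a = rng.uniform()                                # random midpoint fraction in (0, 1)
        b1 = rng.normal(size=d) * np.sqrt(a * h)         # Brownian increment on [0, a*h]
        b2 = rng.normal(size=d) * np.sqrt((1 - a) * h)   # Brownian increment on [a*h, h]
        # Midpoint state, then full step using the drift evaluated at the midpoint.
        x_mid = x - a * h * grad_f(x) + np.sqrt(2.0) * b1
        x = x - h * grad_f(x_mid) + np.sqrt(2.0) * (b1 + b2)
    return x

rng = np.random.default_rng(0)
# Example target: standard Gaussian, f(x) = |x|^2 / 2, so grad f(x) = x.
samples = np.array([
    randomized_midpoint_langevin(lambda x: x, np.zeros(2), h=0.1, n_steps=200, rng=rng)
    for _ in range(2000)
])
print(samples.mean(), samples.var())  # should be near 0 and 1
```

For this Gaussian target the empirical mean and variance of the samples should be close to 0 and 1; the random midpoint evaluation is what reduces the discretization bias relative to Euler-Maruyama, which is the effect the paper quantifies in $W_2$.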
Similar Papers
The Picard-Lagrange Framework for Higher-Order Langevin Monte Carlo
Statistics Theory
Makes computer learning faster and more accurate.
Analysis of Langevin midpoint methods using an anticipative Girsanov theorem
Numerical Analysis
Makes computer sampling faster and more accurate.
When Langevin Monte Carlo Meets Randomization: Non-asymptotic Error Bounds beyond Log-Concavity and Gradient Lipschitzness
Machine Learning (Stat)
Makes computer models work better for hard problems.