Contractive kinetic Langevin samplers beyond global Lipschitz continuity
By: Iosif Lytras, Panagiotis Mertikopoulos
Potential Business Impact:
Enables faster, more reliable sampling from complex probability distributions, a core step in training and evaluating machine-learning models.
In this paper, we examine the problem of sampling from log-concave distributions with (possibly) superlinear gradient growth under kinetic (underdamped) Langevin algorithms. Using a carefully tailored taming scheme, we propose two novel discretizations of the kinetic Langevin SDE, and we show that they are both contractive and satisfy a log-Sobolev inequality. Building on this, we establish a series of non-asymptotic bounds in $2$-Wasserstein distance between the law reached by each algorithm and the underlying target measure.
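As a rough illustration of the idea, the sketch below discretizes the kinetic (underdamped) Langevin SDE with a *tamed* drift, so that a superlinearly growing gradient cannot blow up the iterates. The specific taming choice here, dividing the gradient by $1 + h\,\|\nabla U(x)\|$, is a standard one from the taming literature and is only an assumption; the paper's actual discretizations and their scaling may differ.

```python
import numpy as np

def tamed_kinetic_langevin(grad_U, x0, n_steps=10_000, step=1e-2, gamma=1.0, seed=0):
    """Euler-type discretization of the kinetic Langevin SDE
        dX_t = V_t dt,
        dV_t = -gamma V_t dt - grad_U(X_t) dt + sqrt(2 gamma) dW_t,
    with the (possibly superlinear) drift replaced by a tamed surrogate
        grad_U(x) / (1 + step * ||grad_U(x)||).
    This is an illustrative sketch, not the paper's exact scheme.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    v = np.zeros_like(x)
    for _ in range(n_steps):
        g = grad_U(x)
        # Taming caps the effective drift at ~1/step, controlling superlinear growth.
        g_tamed = g / (1.0 + step * np.linalg.norm(g))
        v = v - step * gamma * v - step * g_tamed \
            + np.sqrt(2.0 * gamma * step) * rng.standard_normal(x.shape)
        x = x + step * v
    return x
```

For example, with the quartic potential $U(x) = x^4/4$ the gradient $x^3$ grows superlinearly, and an untamed Euler scheme can diverge for large initial conditions, whereas the tamed chain remains stable.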
Similar Papers
The Picard-Lagrange Framework for Higher-Order Langevin Monte Carlo
Statistics Theory
Makes computer learning faster and more accurate.
Operator-Level Quantum Acceleration of Non-Logconcave Sampling
Quantum Physics
Quantum computers solve tough problems faster.
When Langevin Monte Carlo Meets Randomization: Non-asymptotic Error Bounds beyond Log-Concavity and Gradient Lipschitzness
Machine Learning (Stat)
Makes computer models work better for hard problems.