Accelerated Regularized Wasserstein Proximal Sampling Algorithms

Published: January 14, 2026 | arXiv ID: 2601.09848v2

By: Hong Ye Tan, Stanley Osher, Wuchen Li

Potential Business Impact:

Accelerates particle-based sampling for Bayesian machine learning, enabling faster convergence and better-generalizing predictive models.

Business Areas:
A/B Testing, Data and Analytics

We consider sampling from a Gibbs distribution by evolving a finite number of particles using a particular score estimator rather than Brownian motion. To accelerate the particles, we consider a second-order score-based ODE, similar to Nesterov acceleration. In contrast to traditional kernel density score estimation, we use the recently proposed regularized Wasserstein proximal method, yielding the Accelerated Regularized Wasserstein Proximal method (ARWP). We provide a detailed analysis of continuous- and discrete-time non-asymptotic and asymptotic mixing rates for Gaussian initial and target distributions, using techniques from Euclidean acceleration and accelerated information gradients. Compared with the kinetic Langevin sampling algorithm, the proposed algorithm exhibits a higher contraction rate in the asymptotic time regime. Numerical experiments are conducted on various low-dimensional problems, including multi-modal Gaussian mixtures and ill-conditioned Rosenbrock distributions. ARWP exhibits structured and convergent particles, accelerated discrete-time mixing, and faster tail exploration than the non-accelerated regularized Wasserstein proximal method and kinetic Langevin methods. Additionally, ARWP particles exhibit better generalization properties on some non-log-concave Bayesian neural network tasks.
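
To give a rough sense of the deterministic, score-driven dynamics the abstract describes, the sketch below (not the authors' code) integrates a damped second-order ODE of the accelerated-gradient-flow form dx/dt = v, dv/dt = -gamma*v - grad V(x) - grad log rho_t(x), where the score term plays the role of Brownian motion. The paper's regularized Wasserstein proximal score estimator is not reproduced here; instead, an empirical-Gaussian score fit to the particle cloud stands in for it, which is only sensible in the Gaussian initial/target setting the paper analyzes. The damping gamma, step size, particle count, and target matrix A are illustrative choices.

```python
# Minimal sketch (assumptions noted above, not the authors' implementation)
# of accelerated, noise-free particle sampling for a Gaussian target
# N(0, A^{-1}) with potential V(x) = 0.5 * x^T A x:
#     dx/dt = v
#     dv/dt = -gamma * v - grad V(x) - grad log rho_t(x)
# The score grad log rho_t replaces Brownian motion; here it is estimated
# by fitting a Gaussian to the particles, a stand-in for the paper's
# regularized Wasserstein proximal estimator.
import numpy as np

def grad_V(x, A):
    """Row-wise gradient of the quadratic potential V(x) = 0.5 * x^T A x."""
    return x @ A.T

def empirical_gaussian_score(x):
    """Score grad log rho(x) under a Gaussian fit to the particle cloud."""
    mu = x.mean(axis=0)
    cov = np.cov(x, rowvar=False) + 1e-6 * np.eye(x.shape[1])  # regularized
    return -(x - mu) @ np.linalg.inv(cov).T

def accelerated_particle_sampler(A, n_particles=500, dim=2,
                                 gamma=1.5, dt=0.05, n_steps=400, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.normal(loc=3.0, scale=1.0, size=(n_particles, dim))  # offset init
    v = np.zeros_like(x)
    for _ in range(n_steps):
        # Symplectic-Euler-style step of the damped kinetic ODE.
        score = empirical_gaussian_score(x)
        v += dt * (-gamma * v - grad_V(x, A) - score)
        x += dt * v
    return x

if __name__ == "__main__":
    A = np.diag([1.0, 10.0])  # ill-conditioned Gaussian target
    samples = accelerated_particle_sampler(A)
    print("sample mean:", samples.mean(axis=0))            # expect ~ [0, 0]
    print("sample cov:\n", np.cov(samples, rowvar=False))  # expect ~ inv(A)
```

At stationarity the velocities vanish and the score balances -grad V, so the particle law settles on the target N(0, A^{-1}); the kinetic Langevin baseline the paper compares against instead injects stochastic noise into the velocity update.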

Country of Origin
πŸ‡ΊπŸ‡Έ United States

Page Count
39 pages

Category
Statistics: Machine Learning (stat.ML)