A hierarchical entropy method for the delocalization of bias in high-dimensional Langevin Monte Carlo
By: Daniel Lacker, Fuzhong Zhou
Potential Business Impact:
Makes the low-dimensional summaries extracted from large-scale sampling algorithms more accurate, without the error growing with the size of the full model.
The unadjusted Langevin algorithm is widely used for sampling from complex high-dimensional distributions. It is well known to be biased, with the bias typically scaling linearly with the dimension when measured in squared Wasserstein distance. However, the recent paper of Chen et al. (2024) identifies an intriguing new delocalization effect: For a class of distributions with sparse interactions, the bias between low-dimensional marginals scales only with the lower dimension, not the full dimension. In this work, we strengthen the results of Chen et al. (2024) in the sparse interaction regime by removing a logarithmic factor, measuring distance in relative entropy (a.k.a. KL-divergence), and relaxing the strong log-concavity assumption. In addition, we expand the scope of the delocalization phenomenon by showing that it holds for a class of distributions with weak interactions. Our proofs are based on a hierarchical analysis of the marginal relative entropies, inspired by the authors' recent work on propagation of chaos.
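To make the object of study concrete: the unadjusted Langevin algorithm iterates X_{k+1} = X_k - h ∇V(X_k) + sqrt(2h) ξ_k, where π ∝ exp(-V) is the target and ξ_k is standard Gaussian noise. The sketch below is a hypothetical illustration, not code from the paper: it runs ULA on a toy chain potential (a simple sparse-interaction example; the potential, the coupling strength eps, the step size h, and all function names are assumptions made here) and compares the first coordinate's marginal across two ambient dimensions, in the spirit of the delocalization effect.

```python
# Minimal sketch, assuming the chain potential
#   V(x) = 0.5 * sum_i x_i^2 + (eps/2) * sum_i (x_{i+1} - x_i)^2,
# a toy instance of sparse interactions (each coordinate only couples
# to its neighbors). Not the authors' code.
import numpy as np

def grad_V(x, eps=0.25):
    """Gradient of the chain potential: confinement plus nearest-neighbor coupling."""
    g = x.copy()
    g[:-1] += eps * (x[:-1] - x[1:])   # d/dx_j of (eps/2)(x_{j+1}-x_j)^2
    g[1:] += eps * (x[1:] - x[:-1])    # d/dx_j of (eps/2)(x_j-x_{j-1})^2
    return g

def ula(d, h=0.05, n_steps=3000, eps=0.25, rng=None):
    """Unadjusted Langevin: X_{k+1} = X_k - h * grad V(X_k) + sqrt(2h) * xi_k."""
    rng = rng if rng is not None else np.random.default_rng(0)
    x = np.zeros(d)
    sigma = np.sqrt(2.0 * h)
    for _ in range(n_steps):
        x = x - h * grad_V(x, eps) + sigma * rng.standard_normal(d)
    return x  # one approximate (biased) sample from pi

if __name__ == "__main__":
    # Delocalization heuristic: statistics of the first coordinate's
    # marginal should look essentially the same at d = 10 and d = 200,
    # even though the bias of the full d-dimensional law grows with d.
    rng = np.random.default_rng(1)
    for d in (10, 200):
        samples = np.array([ula(d, rng=rng)[0] for _ in range(200)])
        print(f"d={d:4d}: first-marginal mean={samples.mean():+.3f}, "
              f"var={samples.var():.3f}")
```

Under these assumptions, the printed one-dimensional marginal statistics stay stable as d grows, which is the qualitative content of the delocalization results the abstract describes.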
Similar Papers
Relative entropy estimate and geometric ergodicity for implicit Langevin Monte Carlo
Numerical Analysis
Shows an implicit variant of Langevin sampling stays stable and converges reliably over long runs.
Estimation of discrete distributions in relative entropy, and the deviations of the missing mass
Statistics Theory
Measures how well a discrete distribution can be learned from samples, including the probability of outcomes never yet observed.
A Hierarchical Decomposition of Kullback-Leibler Divergence: Disentangling Marginal Mismatches from Statistical Dependencies
Other Computer Science
Breaks the difference between two distributions into per-variable mismatches and dependency effects.