Particle Dynamics for Latent-Variable Energy-Based Models

Published: October 17, 2025 | arXiv ID: 2510.15447v1

By: Shiqin Tang, Shuxin Zhuang, Rong Feng, and more

Potential Business Impact:

Trains generative models that uncover hidden structure in data, without needing extra helper networks.

Business Areas:
A/B Testing, Data and Analytics

Latent-variable energy-based models (LVEBMs) assign a single normalized energy to joint pairs of observed data and latent variables, offering expressive generative modeling while capturing hidden structure. We recast maximum-likelihood training as a saddle-point problem over distributions on the latent and joint manifolds and view the inner updates as coupled Wasserstein gradient flows. The resulting algorithm alternates overdamped Langevin updates for a joint negative pool and for conditional latent particles with stochastic parameter ascent, requiring no discriminator or auxiliary networks. We prove existence and convergence under standard smoothness and dissipativity assumptions, with decay rates in KL divergence and Wasserstein-2 distance. The saddle-point view further yields an ELBO strictly tighter than bounds obtained with restricted amortized posteriors. Our method is evaluated on numerical approximations of physical systems and performs competitively against comparable approaches.
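The alternating scheme the abstract describes can be sketched on a toy model: a persistent joint negative pool updated by overdamped Langevin dynamics, conditional latent particles for the positive phase, and stochastic parameter ascent. The quadratic energy E_theta(x, z) = (x - theta*z)^2/2 + z^2/2, the step sizes, and the particle counts below are illustrative assumptions for this sketch, not the paper's actual setup.

```python
# Toy LVEBM trained with coupled Langevin particle dynamics:
# positive phase samples z | x, negative phase evolves a persistent
# joint (x, z) pool, and theta follows a stochastic ascent step.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data drawn from the model family with true theta = 2:
# z ~ N(0, 1), x | z ~ N(theta * z, 1).
theta_true = 2.0
n = 500
z_data = rng.standard_normal(n)
x_data = theta_true * z_data + rng.standard_normal(n)

def grad_x(x, z, theta):      # dE/dx for E = (x - theta z)^2/2 + z^2/2
    return x - theta * z

def grad_z(x, z, theta):      # dE/dz
    return -theta * (x - theta * z) + z

def grad_theta(x, z, theta):  # dE/dtheta
    return -(x - theta * z) * z

eps, lr, langevin_steps = 0.05, 0.1, 30
theta = 1.0
# Persistent negative pool of joint (x, z) particles.
x_neg = rng.standard_normal(n)
z_neg = rng.standard_normal(n)

for it in range(300):
    # Positive phase: overdamped Langevin on z | x for each data point.
    z_pos = rng.standard_normal(n)
    for _ in range(langevin_steps):
        z_pos += (-eps * grad_z(x_data, z_pos, theta)
                  + np.sqrt(2 * eps) * rng.standard_normal(n))
    # Negative phase: joint Langevin updates on the persistent pool.
    for _ in range(langevin_steps):
        x_new = (x_neg - eps * grad_x(x_neg, z_neg, theta)
                 + np.sqrt(2 * eps) * rng.standard_normal(n))
        z_neg += (-eps * grad_z(x_neg, z_neg, theta)
                  + np.sqrt(2 * eps) * rng.standard_normal(n))
        x_neg = x_new
    # Stochastic parameter ascent on the log-likelihood:
    # E_neg[dE/dtheta] - E_pos[dE/dtheta].
    g = (grad_theta(x_neg, z_neg, theta).mean()
         - grad_theta(x_data, z_pos, theta).mean())
    theta += lr * g

print(f"learned theta: {theta:.2f}")
```

Note that no amortized encoder or discriminator appears anywhere: both sampling phases reuse the same energy gradients, which is the point of the particle-based formulation.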

Country of Origin
🇭🇰 Hong Kong

Page Count
13 pages

Category
Computer Science:
Machine Learning (CS)