Latent Nonlinear Denoising Score Matching for Enhanced Learning of Structured Distributions

Published: December 7, 2025 | arXiv ID: 2512.06615v1

By: Kaichen Shen, Wei Zhu

Potential Business Impact:

Makes AI create better, more varied pictures faster.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

We present latent nonlinear denoising score matching (LNDSM), a novel training objective for score-based generative models that integrates nonlinear forward dynamics with the VAE-based latent SGM framework. This combination is achieved by reformulating the cross-entropy term using the approximate Gaussian transition induced by the Euler-Maruyama scheme. To ensure numerical stability, we identify and remove two zero-mean but variance-exploding terms that arise at small time steps. Experiments on variants of the MNIST dataset demonstrate that the proposed method achieves faster synthesis and enhanced learning of inherently structured distributions. Compared to benchmark structure-agnostic latent SGMs, LNDSM consistently attains superior sample quality and variability.
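The abstract's key technical ingredient is that one Euler-Maruyama step of a nonlinear forward SDE induces an approximately Gaussian transition, which makes the cross-entropy term tractable. A minimal sketch of this idea, assuming a generic SDE dx = f(x,t) dt + g(t) dW with a hypothetical tanh drift chosen purely for illustration (the paper's actual drift and latent-space setup are not reproduced here):

```python
import numpy as np

def euler_maruyama_step(x, t, dt, drift, diffusion, rng):
    """One Euler-Maruyama step for dx = f(x, t) dt + g(t) dW.

    The induced one-step transition is approximately Gaussian:
        x_{t+dt} | x_t ~ N(x_t + f(x_t, t) * dt, g(t)^2 * dt * I),
    which is the approximation the LNDSM objective exploits.
    """
    mean = x + drift(x, t) * dt          # Gaussian mean of the transition
    std = diffusion(t) * np.sqrt(dt)     # Gaussian std of the transition
    return mean + std * rng.standard_normal(x.shape)

# Hypothetical nonlinear drift and constant diffusion, for illustration only.
drift = lambda x, t: -np.tanh(x)
diffusion = lambda t: 1.0

rng = np.random.default_rng(0)
x = rng.standard_normal(4)               # toy 4-dimensional state
dt = 0.01
for k in range(100):                      # simulate the forward process to t = 1
    x = euler_maruyama_step(x, k * dt, dt, drift, diffusion, rng)
```

Note that the per-step noise variance scales as dt, so terms divided by dt in the objective can blow up as dt shrinks; this is the numerical-stability issue the authors address by removing the two zero-mean, variance-exploding terms.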

Country of Origin
🇺🇸 United States

Page Count
20 pages

Category
Statistics: Machine Learning (stat.ML)