Score-Based Training for Energy-Based TTS Models
By: Wanli Sun, Anton Ragni
Potential Business Impact:
Teaches computers to turn text into natural-sounding speech more reliably.
Noise contrastive estimation (NCE) is a popular method for training energy-based models (EBMs) with intractable normalisation terms. The key idea of NCE is to learn by comparing unnormalised log-likelihoods of reference and noise samples, thus avoiding explicit computation of normalisation terms. However, NCE critically relies on the quality of the noise samples. Recently, sliced score matching (SSM) has been popularised by the closely related diffusion models (DMs). Unlike NCE, SSM learns the gradient of the log-likelihood, or score, by learning the distribution of its projections onto randomly chosen directions. However, both NCE and SSM disregard the form of the log-likelihood function, which is problematic given that EBMs and DMs make use of first-order optimisation during inference. This paper proposes a new criterion that learns scores more suitable for first-order schemes. Experiments contrast these approaches for training EBMs.
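To make the two baseline criteria in the abstract concrete, here is a minimal sketch of NCE and SSM losses for a toy PyTorch energy-based model. The `Ebm` network, the Gaussian noise distribution, and all sizes are illustrative assumptions, not details from the paper, and nothing below is the authors' implementation.

```python
# A minimal sketch of NCE and sliced score matching for a toy EBM.
# All architecture and hyperparameter choices here are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Ebm(nn.Module):
    """Unnormalised log-density: log p_theta(x) = -E_theta(x) + const."""
    def __init__(self, dim: int = 2):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 64), nn.Softplus(),
                                 nn.Linear(64, 1))

    def forward(self, x):
        return -self.net(x).squeeze(-1)  # unnormalised log-likelihood

def nce_loss(ebm, x_data, noise_dist):
    """NCE: classify reference vs. noise samples via the unnormalised
    log-likelihood ratio log p_theta(x) - log q(x), so the normalisation
    term never has to be computed explicitly."""
    x_noise = noise_dist.sample((x_data.shape[0],))
    logit_data = ebm(x_data) - noise_dist.log_prob(x_data)
    logit_noise = ebm(x_noise) - noise_dist.log_prob(x_noise)
    return (F.binary_cross_entropy_with_logits(logit_data, torch.ones_like(logit_data))
            + F.binary_cross_entropy_with_logits(logit_noise, torch.zeros_like(logit_noise)))

def ssm_loss(ebm, x):
    """SSM: match the projection of the score s_theta(x) = grad_x log p_theta(x)
    onto a random direction v, avoiding the full Hessian trace."""
    x = x.detach().requires_grad_(True)
    v = torch.randn_like(x)                                    # random slice
    score = torch.autograd.grad(ebm(x).sum(), x, create_graph=True)[0]
    sv = (score * v).sum(-1)                                   # v^T s_theta(x)
    grad_sv = torch.autograd.grad(sv.sum(), x, create_graph=True)[0]
    return ((v * grad_sv).sum(-1) + 0.5 * sv ** 2).mean()

# Usage on stand-in data: either loss can drive a standard optimiser step.
dim = 2
model, x = Ebm(dim), torch.randn(128, dim)
noise = torch.distributions.MultivariateNormal(torch.zeros(dim), torch.eye(dim))
print(nce_loss(model, x, noise).item(), ssm_loss(model, x).item())
```

Note how neither loss touches the normalisation term: NCE cancels it inside a classification logit, while SSM differentiates it away entirely, which is what makes both applicable to EBMs.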
Similar Papers
Learning Energy-Based Models by Self-normalising the Likelihood
Machine Learning (CS)
Teaches computers to learn from less data.
Learning normalized image densities via dual score matching
Machine Learning (CS)
Teaches computers to understand pictures better.
Credible Uncertainty Quantification under Noise and System Model Mismatch
Signal Processing
Makes computer guesses more trustworthy and reliable.