Beyond Linear Diffusions: Improved Representations for Rare Conditional Generative Modeling
By: Kulunu Dharmakeerthi, Yousef El-Laham, Henry H. Wong, and more
Potential Business Impact:
Models rare and extreme events more accurately than standard diffusion models.
Diffusion models have emerged as powerful generative frameworks with widespread applications across machine learning and artificial intelligence systems. While current research has predominantly focused on linear diffusions, these approaches can face significant challenges when modeling a conditional distribution $P(Y \mid X = x)$ in regions where $P(X = x)$ is small. In such regions, few training samples, if any, are available, so modeling the corresponding conditional density is difficult. Recognizing this, we show that it is possible to adapt the data representation and the forward scheme so that the sample complexity of learning a score-based generative model remains small in low-probability regions of the conditioning space. Drawing inspiration from conditional extreme value theory, we characterize this method precisely in the special case where the conditioning variable $X$ lies in its tail region. We show how a diffusion with a data-driven choice of nonlinear drift term, applied under an appropriate representation of the data, is best suited to modeling tail events. Through empirical validation on two synthetic datasets and a real-world financial dataset, we demonstrate that our tail-adaptive approach significantly outperforms standard diffusion models in accurately capturing response distributions under extreme tail conditions.
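The abstract does not include code, but the contrast it draws (a standard linear-drift forward diffusion versus a nonlinear, data-driven drift applied under a tail-adapted representation) can be sketched. The following is a minimal illustration in Python/NumPy, not the authors' method: the transform `tail_representation`, the tanh-saturating drift, and all parameter choices are hypothetical, made only for exposition.

```python
# Minimal sketch (NOT the paper's implementation): forward diffusion with a
# nonlinear drift under a tail-adapted representation of the conditioning
# variable. All names and parameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def tail_representation(x, threshold):
    """Hypothetical EVT-inspired transform: put exceedances over a high
    threshold on a log scale, so rare tail samples are spread out instead
    of crowded into a low-probability corner of the conditioning space."""
    return np.log1p(np.maximum(x - threshold, 0.0))

def linear_drift(x_t, t):
    """Standard linear (Ornstein-Uhlenbeck-style) drift used by most
    diffusion models: pulls the state toward zero at a fixed rate."""
    return -x_t

def nonlinear_drift(x_t, t, scale=1.0):
    """Illustrative nonlinear drift: saturates for large states, a stand-in
    for the data-driven drift the abstract advocates for tail events."""
    return -np.tanh(x_t / scale) * scale

def forward_diffuse(x0, drift, n_steps=1000, T=1.0, sigma=1.0):
    """Euler-Maruyama simulation of dX_t = drift(X_t, t) dt + sigma dW_t."""
    dt = T / n_steps
    x = x0.copy()
    for k in range(n_steps):
        t = k * dt
        x = x + drift(x, t) * dt + sigma * np.sqrt(dt) * rng.standard_normal(x.shape)
    return x

# Heavy-tailed conditioning samples (Pareto), mimicking rare-event data.
x = rng.pareto(a=2.5, size=10_000) + 1.0
z = tail_representation(x, threshold=np.quantile(x, 0.95))

# Compare terminal distributions under the two forward schemes.
xT_linear = forward_diffuse(z, linear_drift)
xT_nonlinear = forward_diffuse(z, nonlinear_drift)
print("terminal std, linear drift:   ", xT_linear.std())
print("terminal std, nonlinear drift:", xT_nonlinear.std())
```

As we read the abstract, the intuition is that the adapted representation keeps extreme exceedances from being isolated outliers, making the score of the diffused distribution easier to estimate from few samples; a learned nonlinear drift plays the analogous role in the forward scheme.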
Similar Papers
Non-asymptotic convergence bound of conditional diffusion models
Machine Learning (Stat)
Helps AI learn and create data more accurately.
The Principles of Diffusion Models
Machine Learning (CS)
Creates new pictures and sounds from noise.
MAD: Manifold Attracted Diffusion
Machine Learning (Stat)
Makes blurry pictures sharp and clear.