Nonparametric estimation of conditional probability distributions using a generative approach based on conditional push-forward neural networks
By: Nicola Rares Franco, Lorenzo Tedesco
Potential Business Impact:
Lets computers generate many plausible outcomes for a given input, so predictions come with uncertainty estimates.
We introduce conditional push-forward neural networks (CPFN), a generative framework for conditional distribution estimation. Instead of directly modeling the conditional density $f_{Y|X}$, CPFN learns a stochastic map $\varphi=\varphi(x,u)$ such that $\varphi(x,U)$ and $Y|X=x$ follow approximately the same law, with $U$ a suitable random vector of pre-defined latent variables. This enables efficient conditional sampling and straightforward estimation of conditional statistics through Monte Carlo methods. The model is trained via an objective function derived from a Kullback-Leibler formulation, without requiring invertibility or adversarial training. We establish a near-asymptotic consistency result and demonstrate experimentally that CPFN can achieve performance competitive with, or even superior to, state-of-the-art methods, including kernel estimators, tree-based algorithms, and popular deep learning techniques, all while remaining lightweight and easy to train.
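The abstract describes the mechanism concretely enough to sketch in code: a network maps a condition x and a latent draw u to a sample meant to mimic Y | X = x, and conditional statistics follow from repeated latent draws. Below is a minimal, hypothetical PyTorch sketch of that push-forward idea. The architecture, the Gaussian latent, and all names (e.g. CPFNSketch) are illustrative assumptions on our part; the paper's KL-derived training objective is not reproduced here.

```python
import torch
import torch.nn as nn

class CPFNSketch(nn.Module):
    """Hypothetical push-forward map phi(x, u): condition + latent -> sample."""
    def __init__(self, x_dim: int, u_dim: int, y_dim: int, hidden: int = 64):
        super().__init__()
        self.u_dim = u_dim
        self.net = nn.Sequential(
            nn.Linear(x_dim + u_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, y_dim),
        )

    def forward(self, x: torch.Tensor, u: torch.Tensor) -> torch.Tensor:
        # Push the latent draw u through the network, conditioned on x.
        return self.net(torch.cat([x, u], dim=-1))

    @torch.no_grad()
    def sample(self, x: torch.Tensor, n: int) -> torch.Tensor:
        # Conditional sampling: n draws of phi(x, U) per query condition x.
        xr = x.repeat_interleave(n, dim=0)             # (B*n, x_dim)
        u = torch.randn(xr.shape[0], self.u_dim)       # assumed latent U ~ N(0, I)
        return self(xr, u).reshape(x.shape[0], n, -1)  # (B, n, y_dim)

# Monte Carlo estimation of conditional statistics from the learned sampler.
model = CPFNSketch(x_dim=3, u_dim=2, y_dim=1)
x = torch.randn(5, 3)                    # five query conditions
samples = model.sample(x, n=1000)        # (5, 1000, 1) conditional samples
cond_mean = samples.mean(dim=1)          # estimate of E[Y | X = x]
cond_q90 = samples.quantile(0.9, dim=1)  # estimate of the 90% conditional quantile
```

Once trained, any conditional statistic (mean, quantiles, tail probabilities) reduces to averaging over latent draws, which is the "straightforward estimation through Monte Carlo methods" the abstract refers to.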
Similar Papers
Federated Conditional Conformal Prediction via Generative Models
Machine Learning (CS)
Gives trustworthy prediction ranges without pooling everyone's data in one place.
Accelerated Execution of Bayesian Neural Networks using a Single Probabilistic Forward Pass and Code Generation
Machine Learning (CS)
Makes uncertainty-aware AI run much faster in a single pass.