Score: 2

Nonparametric estimation of conditional probability distributions using a generative approach based on conditional push-forward neural networks

Published: November 18, 2025 | arXiv ID: 2511.14455v1

By: Nicola Rares Franco, Lorenzo Tedesco

Potential Business Impact:

Lets software predict the full range of likely outcomes from input data, not just a single best guess.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

We introduce conditional push-forward neural networks (CPFN), a generative framework for conditional distribution estimation. Instead of directly modeling the conditional density $f_{Y|X}$, CPFN learns a stochastic map $\varphi=\varphi(x,u)$ such that $\varphi(x,U)$ and $Y|X=x$ follow approximately the same law, with $U$ a suitable random vector of pre-defined latent variables. This enables efficient conditional sampling and straightforward estimation of conditional statistics through Monte Carlo methods. The model is trained via an objective function derived from a Kullback-Leibler formulation, without requiring invertibility or adversarial training. We establish a near-asymptotic consistency result and demonstrate experimentally that CPFN can achieve performance competitive with, or even superior to, state-of-the-art methods, including kernel estimators, tree-based algorithms, and popular deep learning techniques, all while remaining lightweight and easy to train.
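The sampling interface described above can be sketched in a few lines: a map $\varphi(x,u)$ takes the conditioning input $x$ together with latent noise $u$, and repeated draws of $U$ yield approximate samples of $Y|X=x$ from which conditional statistics are estimated by Monte Carlo. The network below is a minimal, untrained stand-in (its architecture, latent dimension, and Gaussian latent law are illustrative assumptions, not the paper's specification); in CPFN the weights would be fitted with the KL-derived objective the abstract mentions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the paper):
# conditioning input x, latent noise u, hidden layer, output y.
d_x, d_u, d_h, d_y = 2, 3, 16, 1

# Random (untrained) weights for a one-hidden-layer stochastic map.
W1 = rng.normal(size=(d_x + d_u, d_h)) / np.sqrt(d_x + d_u)
b1 = np.zeros(d_h)
W2 = rng.normal(size=(d_h, d_y)) / np.sqrt(d_h)
b2 = np.zeros(d_y)

def phi(x, u):
    """Push latent noise u through the network, conditioned on x.

    x: (n, d_x) batch of conditioning inputs
    u: (n, d_u) batch of latent draws
    returns: (n, d_y) approximate samples of Y | X = x
    """
    h = np.tanh(np.concatenate([x, u], axis=1) @ W1 + b1)
    return h @ W2 + b2

def conditional_samples(x0, n=2000):
    """Draw n samples approximating Y | X = x0 via phi(x0, U), U ~ N(0, I)."""
    u = rng.normal(size=(n, d_u))
    x = np.tile(x0, (n, 1))
    return phi(x, u)

# Monte Carlo estimation of conditional statistics at a query point x0.
x0 = np.array([0.5, -1.0])
ys = conditional_samples(x0)
cond_mean = ys.mean(axis=0)       # estimate of E[Y | X = x0]
cond_q90 = np.quantile(ys, 0.9)   # estimate of the conditional 0.9-quantile
```

Because $\varphi$ is a plain feed-forward map, no invertibility constraint or adversarial critic is needed at sampling time: conditional means, quantiles, or any other statistic come directly from the empirical distribution of the drawn samples.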

Repos / Data Links

Page Count
38 pages

Category
Computer Science:
Machine Learning (CS)