On Conditional Stochastic Interpolation for Generative Nonlinear Sufficient Dimension Reduction
By: Shuntuo Xu, Zhou Yu, Jian Huang
Identifying low-dimensional sufficient structures in nonlinear sufficient dimension reduction (SDR) has long been a fundamental yet challenging problem. Most existing methods lack theoretical guarantees of exhaustiveness in identifying lower-dimensional structures, either at the population level or at the sample level. We tackle this issue by proposing a new method, generative sufficient dimension reduction (GenSDR), which leverages modern generative models. We show that GenSDR can fully recover the information contained in the central $\sigma$-field at both the population and sample levels. In particular, at the sample level, we establish a consistency property for the GenSDR estimator from the perspective of conditional distributions, capitalizing on the distributional learning capabilities of deep generative models. Moreover, by incorporating an ensemble technique, we extend GenSDR to accommodate scenarios with non-Euclidean responses, thereby substantially broadening its applicability. Extensive numerical results demonstrate the strong empirical performance of GenSDR and highlight its potential for addressing a wide range of complex, real-world tasks.
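To make the idea concrete, the following is a minimal, hypothetical sketch (not the authors' implementation) of how a generative SDR procedure based on a conditional stochastic interpolant might be set up: a low-dimensional representation $R(X)$ is learned jointly with a conditional velocity field trained on a linear interpolant between reference noise and the response, and samples from the learned conditional law of $Y$ given $R(X)$ are drawn by integrating the velocity field. All dimensions, architectures, and the toy data below are assumptions for illustration only.

```python
# Hypothetical sketch, not the GenSDR code: joint training of a low-dimensional
# representation R(X) and a conditional velocity field via a linear stochastic
# interpolant (flow-matching-style objective). Dimensions and nets are assumed.
import torch
import torch.nn as nn

p_x, p_y, d_rep = 10, 1, 2  # ambient dims and reduced dimension (assumed)

class Representation(nn.Module):  # R: X -> low-dimensional representation
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(p_x, 64), nn.ReLU(),
                                 nn.Linear(64, d_rep))
    def forward(self, x):
        return self.net(x)

class Velocity(nn.Module):  # v(t, z_t, R(x)): conditional velocity field
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(1 + p_y + d_rep, 128), nn.ReLU(),
                                 nn.Linear(128, p_y))
    def forward(self, t, z, r):
        return self.net(torch.cat([t, z, r], dim=1))

R, v = Representation(), Velocity()
opt = torch.optim.Adam(list(R.parameters()) + list(v.parameters()), lr=1e-3)

# Toy data (assumed): Y depends on X only through a 2-dimensional function of X.
n = 2048
X = torch.randn(n, p_x)
Y = X[:, :1] * X[:, 1:2] + 0.1 * torch.randn(n, 1)

for step in range(2000):
    idx = torch.randint(0, n, (256,))
    x, y = X[idx], Y[idx]
    z0 = torch.randn_like(y)                        # reference noise
    t = torch.rand(y.size(0), 1)
    zt = (1 - t) * z0 + t * y                       # linear interpolant
    target = y - z0                                 # its velocity
    loss = ((v(t, zt, R(x)) - target) ** 2).mean()  # matching objective
    opt.zero_grad(); loss.backward(); opt.step()

@torch.no_grad()
def sample(x_new, steps=100):
    # Draw from the learned conditional law of Y given R(x) by Euler integration.
    r = R(x_new)
    z = torch.randn(x_new.size(0), p_y)
    for k in range(steps):
        t = torch.full((x_new.size(0), 1), k / steps)
        z = z + v(t, z, r) / steps
    return z
```

Under this sketch, if the learned conditional sampler reproduces the distribution of $Y$ given $X$ while depending on $X$ only through $R(X)$, then $R(X)$ serves as a sufficient low-dimensional representation in the SDR sense.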