Minimax Optimal Rates for Regression on Manifolds and Distributions
By: Rong Tang, Yun Yang
Potential Business Impact:
Predicts the full range of possible outcomes, not just an average.
Distribution regression seeks to estimate the conditional distribution of a multivariate response given a continuous covariate. This approach offers a more complete characterization of dependence than traditional regression methods. Classical nonparametric techniques often assume that the conditional distribution has a well-defined density, an assumption that fails in many real-world settings. These include cases where data contain discrete elements or lie on complex low-dimensional structures within high-dimensional spaces. In this work, we establish minimax convergence rates for distribution regression under nonparametric assumptions, focusing on scenarios where both covariates and responses lie on low-dimensional manifolds. We derive lower bounds that capture the inherent difficulty of the problem and propose a new hybrid estimator that combines adversarial learning with simultaneous least squares to attain matching upper bounds. Our results reveal how the smoothness of the conditional distribution and the geometry of the underlying manifolds together determine the estimation accuracy.
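To make the abstract's ideas concrete, here is a minimal, hypothetical sketch (not the paper's estimator) of two ingredients it discusses: a conditional distribution estimate that needs no density assumption, here via a simple k-nearest-neighbor empirical distribution, and an adversarial discrepancy, here the 1-D Wasserstein-1 distance, which is the integral probability metric over 1-Lipschitz critics. All function names and parameters are illustrative assumptions.

```python
def knn_conditional_sample(X, Y, x0, k):
    """Toy conditional-distribution estimate of Y given X near x0.

    Returns the k responses whose covariates lie closest to x0. Their
    empirical distribution requires no density to exist, so it remains
    meaningful for discrete or manifold-supported responses (illustrative,
    not the paper's hybrid estimator).
    """
    order = sorted(range(len(X)), key=lambda i: abs(X[i] - x0))
    return [Y[i] for i in order[:k]]


def wasserstein1(a, b):
    """Wasserstein-1 distance between two equal-size 1-D samples.

    W1 is an adversarial loss: the supremum over 1-Lipschitz critic
    functions of the difference in expectations. In one dimension it
    reduces to the mean gap between the sorted samples.
    """
    assert len(a) == len(b), "samples must have equal size"
    return sum(abs(u - v) for u, v in zip(sorted(a), sorted(b))) / len(a)


# Usage: responses are discrete (no density), yet the empirical
# conditional estimate and the adversarial metric are well defined.
X = [0.0, 0.1, 0.2, 1.0, 1.1, 1.2]
Y = [0, 0, 1, 1, 1, 1]          # discrete responses near x=0 vs x=1
near_zero = knn_conditional_sample(X, Y, 0.05, 3)
near_one = knn_conditional_sample(X, Y, 1.05, 3)
gap = wasserstein1(near_zero, near_one)
```

The point of the sketch is that both pieces sidestep densities entirely: the estimate is an empirical measure, and the adversarial metric compares measures directly, which is the regime the paper's minimax analysis addresses.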
Similar Papers
Adversarial learning for nonparametric regression: Minimax rate and adaptive estimation
Machine Learning (Stat)
Protects computers from tricky, fake data.
Risk Bounds For Distributional Regression
Machine Learning (Stat)
Helps predict outcomes more accurately.
Minimax Rates for the Estimation of Eigenpairs of Weighted Laplace-Beltrami Operators on Manifolds
Machine Learning (Stat)
Helps computers find hidden patterns in data.