Near-optimal delta-convex estimation of Lipschitz functions
By: Gábor Balázs
Potential Business Impact:
Finds hidden patterns in messy data.
This paper presents a tractable algorithm for estimating an unknown Lipschitz function from noisy observations and establishes an upper bound on its convergence rate. The approach extends max-affine methods from convex shape-restricted regression to the more general Lipschitz setting. A key component is a nonlinear feature expansion that maps max-affine functions into a subclass of delta-convex functions, which act as universal approximators of Lipschitz functions while preserving their Lipschitz constants. Leveraging this property, the estimator attains the minimax convergence rate (up to logarithmic factors) with respect to the intrinsic dimension of the data under squared loss and subgaussian distributions in the random design setting. The algorithm integrates adaptive partitioning to capture intrinsic dimension, a penalty-based regularization mechanism that removes the need to know the true Lipschitz constant, and a two-stage optimization procedure combining a convex initialization with local refinement. The framework is also straightforward to adapt to convex shape-restricted regression. Experiments demonstrate competitive performance relative to other theoretically justified methods, including nearest-neighbor and kernel-based regressors.
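To make the delta-convex idea concrete, below is a minimal sketch (not the paper's algorithm, and omitting its adaptive partitioning, penalty-based regularization, and convex initialization): it fits a difference of two max-affine functions, g(x) = max_k(a_k·x + b_k) − max_j(c_j·x + d_j), to noisy samples of a 1-D Lipschitz target by subgradient descent on the squared loss. The synthetic target, the number of hyperplanes K, the learning rate, and the iteration count are all illustrative choices, not values from the paper.

```python
# Minimal sketch of delta-convex (difference of max-affine) regression.
# Illustrative only; hyperparameters and target are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: a non-convex Lipschitz target observed with Gaussian noise.
n = 400
X = rng.uniform(-1.0, 1.0, size=(n, 1))
y = np.abs(np.sin(3.0 * X[:, 0])) + 0.1 * rng.standard_normal(n)

K = 8                                                      # hyperplanes per max-affine component
A = 0.1 * rng.standard_normal((K, 1)); b = np.zeros(K)     # convex part: max_k(a_k x + b_k)
C = 0.1 * rng.standard_normal((K, 1)); d = np.zeros(K)     # concave part: -max_j(c_j x + d_j)

def predict(X, A, b, C, d):
    """Delta-convex prediction: difference of two max-affine functions."""
    return (X @ A.T + b).max(axis=1) - (X @ C.T + d).max(axis=1)

lr, n_iter = 0.05, 2000
for _ in range(n_iter):
    P = X @ A.T + b                                # (n, K) affine pieces, convex part
    Q = X @ C.T + d                                # (n, K) affine pieces, concave part
    ka, kc = P.argmax(axis=1), Q.argmax(axis=1)    # active hyperplane per sample
    resid = (P.max(axis=1) - Q.max(axis=1)) - y    # prediction error
    gA = np.zeros_like(A); gb = np.zeros_like(b)
    gC = np.zeros_like(C); gd = np.zeros_like(d)
    # The subgradient of the squared loss flows only through the active hyperplanes.
    np.add.at(gA, ka, resid[:, None] * X)
    np.add.at(gb, ka, resid)
    np.add.at(gC, kc, -resid[:, None] * X)
    np.add.at(gd, kc, -resid)
    A -= lr * gA / n; b -= lr * gb / n
    C -= lr * gC / n; d -= lr * gd / n

print("train RMSE:", np.sqrt(np.mean((predict(X, A, b, C, d) - y) ** 2)))
```

Each max-affine component is convex and piecewise linear, so their difference can represent non-convex shapes while its Lipschitz constant is controlled by the slopes a_k and c_j; this is the structural property the paper's estimator exploits, with additional machinery to choose the model size and regularization adaptively.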
Similar Papers
Learning and Testing Convex Functions
Data Structures and Algorithms
Teaches computers to understand smooth, curvy math rules.
Improving Online-to-Nonconvex Conversion for Smooth Optimization via Double Optimism
Optimization and Control
Finds better solutions to hard math problems faster.
An Adaptive Sampling Algorithm for Level-set Approximation
Numerical Analysis
Finds shapes in messy data faster.