Deterministic Coreset Construction via Adaptive Sensitivity Trimming
By: Faruk Alpay, Taylan Alpay
Potential Business Impact:
Makes machine-learning training faster and cheaper by provably shrinking the training set to a small, representative coreset, without sacrificing the accuracy of the learned model.
We develop a rigorous framework for deterministic coreset construction in empirical risk minimization (ERM). Our central contribution is the Adaptive Deterministic Uniform-Weight Trimming (ADUWT) algorithm, which constructs a coreset by excising points with the lowest sensitivity bounds and applying a data-dependent uniform weight to the remainder. The method yields a uniform $(1\pm\varepsilon)$ relative-error approximation for the ERM objective over the entire hypothesis space. We provide a complete analysis, including (i) a minimax characterization proving the optimality of the adaptive weight, (ii) an instance-dependent size analysis in terms of a \emph{Sensitivity Heterogeneity Index}, and (iii) tractable sensitivity oracles for kernel ridge regression, regularized logistic regression, and linear SVM. Reproducibility is supported by precise pseudocode for the algorithm, the sensitivity oracles, and the evaluation pipeline. Empirical results align with the theory. We conclude with open problems on instance-optimal oracles, deterministic streaming, and fairness-constrained ERM.
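To make the construction concrete, the sketch below illustrates the trimming step described in the abstract: given per-point sensitivity upper bounds from a problem-specific oracle, the lowest-sensitivity points are excised and the survivors receive a single uniform weight, so that the weighted coreset objective $\hat L_C(h)$ stays within $(1\pm\varepsilon)$ of the full objective $\hat L(h)$ for every hypothesis $h$. This is a minimal sketch under stated assumptions, not the paper's exact procedure: the mass-preserving weight $n/|C|$ and the fixed trim fraction used here are illustrative stand-ins for the paper's minimax-optimal adaptive weight and its size bound in terms of the Sensitivity Heterogeneity Index.

```python
import numpy as np

def sensitivity_trim_coreset(sensitivity_upper_bounds, trim_fraction):
    """Illustrative sketch of sensitivity-based trimming with a uniform weight.

    sensitivity_upper_bounds: per-point upper bounds s_i, assumed to come from
        a sensitivity oracle (e.g. for kernel ridge regression, regularized
        logistic regression, or linear SVM).
    trim_fraction: fraction of points with the smallest bounds to excise
        (an assumption for this sketch; the paper sets the coreset size
        adaptively).

    Returns the indices of retained points and one uniform weight. The
    mass-preserving weight n / |C| is an illustrative choice, not the
    paper's data-dependent adaptive weight.
    """
    s = np.asarray(sensitivity_upper_bounds, dtype=float)
    n = s.size
    num_trim = int(trim_fraction * n)
    # Excise the points whose sensitivity bounds are smallest: these are the
    # points whose loss contribution is easiest for the remainder to cover.
    order = np.argsort(s)                 # ascending sensitivity bounds
    kept = np.sort(order[num_trim:])      # retain the high-sensitivity points
    weight = n / kept.size                # uniform weight preserving total mass
    return kept, weight

# Usage sketch: the weighted coreset average approximates the full ERM objective.
# The random arrays are placeholders; in practice losses and sensitivity bounds
# come from the actual ERM problem and hypothesis under evaluation.
n = 1000
losses = np.random.rand(n)
kept, w = sensitivity_trim_coreset(np.random.rand(n), trim_fraction=0.3)
approx_objective = w * losses[kept].sum() / n   # compare against losses.mean()
```

The design point the sketch is meant to convey is that trimming plus a single shared weight keeps the coreset deterministic and the reweighting trivial; all of the approximation guarantee then rests on how the sensitivity bounds and the adaptive weight are chosen, which is where the paper's analysis lives.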