Coresets for Clustering Under Stochastic Noise
By: Lingxiao Huang, Zhize Li, Nisheeth K. Vishnoi, and others
Potential Business Impact:
Summarizes noisy data compactly for more reliable machine learning.
We study the problem of constructing coresets for $(k, z)$-clustering when the input dataset is corrupted by stochastic noise drawn from a known distribution. In this setting, evaluating the quality of a coreset is inherently challenging, as the true underlying dataset is unobserved. To address this, we investigate coreset construction using surrogate error metrics that are tractable and provably related to the true clustering cost. We analyze a traditional metric from prior work and introduce a new error metric that more closely aligns with the true cost. Although our metric is defined independently of the noise distribution, it enables approximation guarantees that scale with the noise level. We design a coreset construction algorithm based on this metric and show that, under mild assumptions on the data and noise, enforcing an $\varepsilon$-bound under our metric yields smaller coresets and tighter guarantees on the true clustering cost than those obtained via classical metrics. In particular, we prove that the coreset size can improve by a factor of up to $\mathrm{poly}(k)$. Experiments on real-world datasets support our theoretical findings and demonstrate the practical advantages of our approach.
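To make the coreset notion concrete, here is a minimal, hypothetical sketch (not the paper's algorithm): a weighted subset $S$ of the observed noisy points whose weighted $k$-means cost ($z = 2$) approximates the full dataset's cost for any candidate set of centers. The uniform-sampling construction, the synthetic Gaussian mixture, and the additive Gaussian noise model are all illustrative assumptions.

```python
import numpy as np

def kmeans_cost(X, centers, weights=None):
    # (k, z)-clustering cost with z = 2: each point pays its squared
    # distance to the nearest center; weights default to 1.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1).min(1)
    return d2.sum() if weights is None else (weights * d2).sum()

def uniform_coreset(X, m, rng):
    # Simplest coreset: m uniformly sampled points, each weighted n/m
    # so the total weight matches the full dataset.
    idx = rng.choice(len(X), size=m, replace=False)
    return X[idx], np.full(m, len(X) / m)

rng = np.random.default_rng(0)
k = 3
# Synthetic "true" data: three well-separated 2-D Gaussian clusters.
true_X = np.concatenate(
    [rng.normal(c, 1.0, size=(600, 2)) for c in (0.0, 6.0, 12.0)]
)
# Observed data = true data + stochastic noise from a known distribution.
noisy_X = true_X + rng.normal(0.0, 0.5, size=true_X.shape)

S, w = uniform_coreset(noisy_X, m=200, rng=rng)

# Empirical quality check: worst relative cost error over random center sets.
errs = []
for _ in range(20):
    C = rng.uniform(noisy_X.min(), noisy_X.max(), size=(k, 2))
    full = kmeans_cost(noisy_X, C)
    errs.append(abs(kmeans_cost(S, C, w) - full) / full)
print(f"max relative error over 20 center sets: {max(errs):.3f}")
```

A construction with formal guarantees would replace uniform sampling with importance (sensitivity-based) sampling and bound the error uniformly over all center sets, but the evaluation loop above is the same comparison an $\varepsilon$-bound formalizes.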
Similar Papers
Linear time small coresets for k-mean clustering of segments with applications
Machine Learning (CS)
Groups shapes in videos faster and more accurately.
Coreset for Robust Geometric Median: Eliminating Size Dependency on Outliers
Data Structures and Algorithms
Makes data summaries better by ignoring bad data.