LocalKMeans: Convergence of Lloyd's Algorithm with Distributed Local Iterations
By: Harsh Vardhan, Heng Zhu, Avishek Ghosh, and more
Potential Business Impact:
Makes computer learning faster on many machines.
In this paper, we analyze the classical $K$-means alternating-minimization algorithm, also known as Lloyd's algorithm (Lloyd, 1957), for a mixture of Gaussians in a data-distributed setting that incorporates local iteration steps. Assuming unlabeled data distributed across multiple machines, we propose an algorithm, LocalKMeans, that performs Lloyd's algorithm in parallel across the machines by running its iterations on local data, synchronizing only once every $L$ such local steps. We characterize the cost of these local iterations against the non-distributed setting, and show that the price paid for the local steps is a higher required signal-to-noise ratio. While local iterations have been studied theoretically in the past for gradient-based learning methods, the analysis of unsupervised learning methods is more involved than that of iterative gradient-based algorithms owing to the presence of latent variables, e.g., cluster identities. To obtain our results, we adapt a virtual iterate method to work with a non-convex, non-smooth objective function, in conjunction with a tight statistical analysis of Lloyd steps.
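To make the setup concrete, the following is a minimal Python sketch of the scheme the abstract describes: each machine runs Lloyd's steps on its own data shard and the center estimates are synchronized only once every $L$ local steps. The function names (`lloyd_step`, `local_kmeans`) and the size-weighted averaging at synchronization are illustrative assumptions, not the authors' reference implementation or analysis setup.

```python
import numpy as np

def lloyd_step(X, centers):
    """One Lloyd's step on local data X: assign each point to its nearest
    center, then recompute each center as the mean of its assigned points."""
    # Squared Euclidean distance from every point to every center.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    labels = d2.argmin(axis=1)
    new_centers = centers.copy()
    for k in range(centers.shape[0]):
        members = X[labels == k]
        if len(members) > 0:  # keep the previous center if a cluster is empty
            new_centers[k] = members.mean(axis=0)
    return new_centers

def local_kmeans(shards, init_centers, rounds, L):
    """Hypothetical LocalKMeans sketch: each machine (one shard of data)
    runs L Lloyd's steps locally, then centers are averaged at a
    synchronization round; `rounds` is the number of synchronizations."""
    centers = init_centers.copy()
    for _ in range(rounds):
        local_centers = []
        for X in shards:  # in practice these loops run in parallel
            c = centers.copy()
            for _ in range(L):
                c = lloyd_step(X, c)
            local_centers.append(c)
        # Synchronize: average per-machine centers, weighted by shard size
        # (an assumed aggregation rule; the paper's may differ).
        weights = np.array([len(X) for X in shards], dtype=float)
        centers = np.average(np.stack(local_centers), axis=0, weights=weights)
    return centers

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_means = np.array([[-3.0, 0.0], [3.0, 0.0]])
    # Simulate a 2-component Gaussian mixture split across 4 machines.
    shards = [np.vstack([m + rng.normal(size=(100, 2)) for m in true_means])
              for _ in range(4)]
    init = true_means + rng.normal(scale=0.5, size=true_means.shape)
    print(local_kmeans(shards, init, rounds=5, L=3))
```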
Similar Papers
Modified K-means Algorithm with Local Optimality Guarantees
Machine Learning (CS)
Makes computer groups more accurate and reliable.
An Observation on Lloyd's k-Means Algorithm in High Dimensions
Machine Learning (Stat)
Fixes computer grouping when data is messy.
Provably faster randomized and quantum algorithms for $k$-means clustering via uniform sampling
Quantum Physics
Speeds up sorting big data into groups.