Score: 2

An Efficient Massively Parallel Constant-Factor Approximation Algorithm for the $k$-Means Problem

Published: July 18, 2025 | arXiv ID: 2507.14089v1

By: Vincent Cohen-Addad, Fabian Kuhn, Zahra Parsaeian

BigTech Affiliations: Google

Potential Business Impact:

Lets large computing clusters group massive datasets into k clusters much faster, speeding up large-scale data analysis.

Business Areas:
Fast-Moving Consumer Goods, Consumer Goods, Real Estate

In this paper, we present an efficient massively parallel approximation algorithm for the $k$-means problem. Specifically, we provide an MPC algorithm that computes a constant-factor approximation to an arbitrary $k$-means instance in $O(\log\log n \cdot \log\log\log n)$ rounds. The algorithm uses $O(n^\sigma)$ bits of memory per machine, where $\sigma > 0$ is a constant that can be made arbitrarily small. The global memory usage is $O(n^{1+\varepsilon})$ bits for an arbitrarily small constant $\varepsilon > 0$, and is thus only slightly superlinear. Recently, Czumaj, Gao, Jiang, Krauthgamer, and Veselý showed that a constant-factor bicriteria approximation can be computed in $O(1)$ rounds in the MPC model. However, our algorithm is the first constant-factor approximation for the general $k$-means problem that runs in $o(\log n)$ rounds in the MPC model. Our approach builds upon the foundational framework of Jain and Vazirani. The core component of our algorithm is a constant-factor approximation for the related facility location problem. While such an approximation was already achieved in constant time in the work of Czumaj et al. mentioned above, our version additionally satisfies the so-called Lagrangian Multiplier Preserving (LMP) property. This property enables the transformation of a facility location approximation into a comparably good $k$-means approximation.
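For context, the LMP property is commonly stated as follows; this is the standard Jain-Vazirani-style formulation, and the paper's exact notation or constants may differ. Suppose a facility location algorithm opens a facility set $S$ with total opening cost $f(S)$ and total connection cost $C(S)$. The algorithm is an LMP $\rho$-approximation if its output satisfies

$$C(S) + \rho \cdot f(S) \le \rho \cdot \mathrm{OPT},$$

where $\mathrm{OPT}$ is the optimal facility location cost. Paying the opening cost at the full factor $\rho$ (rather than factor 1) is what makes the Lagrangian-relaxation argument go through: setting a uniform opening cost $\lambda$ that acts as the Lagrange multiplier for the constraint of opening at most $k$ centers lets such a facility location algorithm be converted into a comparably good constant-factor approximation for $k$-means.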

Country of Origin
🇩🇪 🇺🇸 Germany, United States

Page Count
55 pages

Category
Computer Science:
Data Structures and Algorithms