Convergence and Optimality of the EM Algorithm Under Multi-Component Gaussian Mixture Models

Published: September 10, 2025 | arXiv ID: 2509.08237v1

By: Xin Bing, Dehan Kong, Bingqing Li

Potential Business Impact:

Helps computers reliably separate messy, mixed data into its hidden groups.

Business Areas:
A/B Testing, Data and Analytics

Gaussian mixture models (GMMs) are fundamental statistical tools for modeling heterogeneous data. Because the likelihood function is nonconcave, the Expectation-Maximization (EM) algorithm is widely used to estimate the parameters of each Gaussian component. Existing analyses of the EM algorithm's convergence to the true parameter focus primarily on either the two-component case or multi-component settings with both known mixing probabilities and known, isotropic covariance matrices. In this work, we establish the minimax optimal rate of convergence of the EM algorithm for multi-component GMMs in full generality. The separation condition between Gaussian components required for EM to converge is the weakest known to date. We develop two distinct analytical approaches, each tailored to a different regime of separation, reflecting two complementary perspectives on the use of EM: parameter estimation and clustering. As a byproduct of our analysis, we show that the EM algorithm, when used for clustering, also achieves the minimax optimal rate of misclustering error under milder separation conditions than spectral clustering and Lloyd's algorithm, an interesting result in its own right. Our analysis allows the number of components, the minimal mixing probabilities, the separation between Gaussian components, as well as the dimension, to grow with the sample size. Simulation studies corroborate the theoretical findings.
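For readers who want a concrete reference point for the iteration the abstract analyzes, below is a minimal NumPy/SciPy sketch of EM for a K-component GMM: the E-step computes per-point responsibilities, the M-step re-estimates weights, means, and covariances, and hard cluster labels come from the most responsible component. The function name, the random-rows initialization, and the small ridge regularization are our own illustrative choices, not the paper's algorithm or analysis.

```python
# Minimal EM sketch for a K-component Gaussian mixture (illustrative only;
# the paper studies the convergence rate of this iteration, not this code).
import numpy as np
from scipy.stats import multivariate_normal

def em_gmm(X, K, n_iter=100, seed=0):
    """Fit a K-component GMM to data X of shape (n, d) via EM."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Initialization (an assumption of this sketch): random data points as
    # means, pooled sample covariance for every component, uniform weights.
    means = X[rng.choice(n, K, replace=False)]
    covs = np.array([np.cov(X.T) + 1e-6 * np.eye(d) for _ in range(K)])
    weights = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        # E-step: responsibilities r[i, k] proportional to
        # weight_k * N(x_i; mean_k, cov_k), computed in log space.
        log_r = np.stack([
            np.log(weights[k])
            + multivariate_normal.logpdf(X, means[k], covs[k])
            for k in range(K)
        ], axis=1)
        log_r -= log_r.max(axis=1, keepdims=True)  # numerical stabilization
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted updates of all parameters.
        Nk = r.sum(axis=0)
        weights = Nk / n
        means = (r.T @ X) / Nk[:, None]
        for k in range(K):
            Xc = X - means[k]
            covs[k] = (r[:, k, None] * Xc).T @ Xc / Nk[k] + 1e-6 * np.eye(d)
    return weights, means, covs, r

# Clustering use: assign each point to its most responsible component.
# weights, means, covs, r = em_gmm(X, K=3)
# labels = r.argmax(axis=1)
```

The two perspectives in the abstract map directly onto this sketch: the estimation view asks how fast `weights`, `means`, and `covs` approach the truth, while the clustering view asks how often the hard labels `r.argmax(axis=1)` misassign points, which is the misclustering error the paper bounds.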

Country of Origin
🇨🇦 Canada

Page Count
85 pages

Category
Mathematics:
Statistics Theory