Revisit CP Tensor Decomposition: Statistical Optimality and Fast Convergence
By: Runshi Tang, Julien Chhor, Olga Klopp, and more
Potential Business Impact:
Makes computer analysis of complex data more accurate.
Canonical Polyadic (CP) tensor decomposition is a fundamental technique for analyzing high-dimensional tensor data. While the Alternating Least Squares (ALS) algorithm is widely used for computing CP decompositions due to its simplicity and empirical success, its theoretical foundation, particularly regarding statistical optimality and convergence behavior, remains underdeveloped, especially in noisy, non-orthogonal, and higher-rank settings. In this work, we revisit CP tensor decomposition from a statistical perspective and provide a comprehensive theoretical analysis of ALS under a signal-plus-noise model. We establish non-asymptotic, minimax-optimal error bounds for tensors of general order, dimensions, and rank, assuming suitable initialization. To enable such initialization, we propose Tucker-based Approximation with Simultaneous Diagonalization (TASD), a robust method that improves stability and accuracy in noisy regimes. Combined with ALS, TASD yields a statistically consistent estimator. We further analyze the convergence dynamics of ALS, identifying a two-phase pattern: initial quadratic convergence followed by linear refinement. We also show that in the rank-one setting, ALS with an appropriately chosen initialization attains optimal error within just one or two iterations.
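To make the ALS procedure concrete, here is a minimal, hedged sketch of the classical ALS loop for a third-order CP decomposition (not the paper's TASD-initialized method; `cp_als`, its random initialization, and the iteration count are illustrative assumptions). Each step solves a linear least-squares problem for one factor while holding the other two fixed, using the normal equations with the Hadamard product of the other factors' Gram matrices:

```python
import numpy as np

def cp_als(T, rank, n_iter=100, seed=0):
    """Illustrative ALS for 3-way CP decomposition (sketch, not the paper's method).

    Each update solves A <- M G^+, where M contracts T with the other two
    factors and G is the Hadamard (elementwise) product of their Grams.
    """
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    for _ in range(n_iter):
        # Update each factor in turn, holding the other two fixed.
        A = np.einsum('ijk,jr,kr->ir', T, B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = np.einsum('ijk,ir,kr->jr', T, A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = np.einsum('ijk,ir,jr->kr', T, A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# Usage: recover a noiseless rank-2 tensor built from random factors.
rng = np.random.default_rng(1)
A0 = rng.standard_normal((5, 2))
B0 = rng.standard_normal((6, 2))
C0 = rng.standard_normal((7, 2))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cp_als(T, rank=2)
T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
rel_err = np.linalg.norm(T - T_hat) / np.linalg.norm(T)
print(rel_err)
```

In the noisy, higher-rank regimes the paper studies, the random initialization above is exactly what can fail; this is the gap that a careful initialization scheme such as the proposed TASD is meant to close before ALS refinement.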
Similar Papers
Convergence of the alternating least squares algorithm for CP tensor decompositions
Numerical Analysis
Makes computer math faster and more reliable.
Efficient QR-Based CP Decomposition Acceleration via Dimension Tree and Extrapolation
Numerical Analysis
Makes computer math faster and more accurate.
A Generating Polynomial Based Two-Stage Optimization Method for Tensor Rank Decomposition
Optimization and Control
Solves hard math problems for computers.