Guaranteed Noisy CP Tensor Recovery via Riemannian Optimization on the Segre Manifold
By: Ke Xu, Yuefeng Han
Potential Business Impact:
Reliably recovers hidden patterns from noisy data.
Recovering a low-CP-rank tensor from noisy linear measurements is a central challenge in high-dimensional data analysis, with applications spanning tensor PCA, tensor regression, and beyond. We exploit the problem's intrinsic geometry by casting the recovery task as an optimization problem over the Segre manifold, the smooth Riemannian manifold of rank-one tensors. This geometric viewpoint yields two powerful algorithms: Riemannian Gradient Descent (RGD) and Riemannian Gauss-Newton (RGN), each of which preserves feasibility at every iteration. Under mild noise assumptions, we prove that RGD converges at a local linear rate, while RGN exhibits an initial phase of local quadratic convergence that transitions to a linear rate as the iterates approach the statistical noise floor. Extensive synthetic experiments validate these convergence guarantees and demonstrate the practical effectiveness of our methods.
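To make the setup concrete, here is a minimal NumPy sketch of the simplest instance: recovering a planted rank-one third-order tensor from its noisy entries by gradient descent in the factored coordinates (a, b, c) of the Segre manifold. This is an illustrative stand-in, not the paper's exact RGD or RGN: the identity measurement map (direct noisy observations, as in tensor PCA), the spectral initialization, and the step size are all assumptions for the demo, while the paper's methods work with tangent-space projections and retractions and come with proven rates.

```python
import numpy as np

rng = np.random.default_rng(0)

def unit(x):
    """Return x scaled to unit Euclidean norm."""
    return x / np.linalg.norm(x)

# Planted rank-one (Segre) tensor with unit-norm factors, observed under noise.
u0, v0, w0 = (unit(rng.standard_normal(n)) for n in (8, 9, 10))
T = np.einsum("i,j,k->ijk", u0, v0, w0)
Y = T + 0.01 * rng.standard_normal(T.shape)

# Spectral initialization: leading left singular vector of each mode unfolding
# (a common warm start for guaranteed recovery; the paper's scheme may differ).
a = np.linalg.svd(Y.reshape(8, -1))[0][:, 0]
b = np.linalg.svd(np.moveaxis(Y, 1, 0).reshape(9, -1))[0][:, 0]
c = np.linalg.svd(np.moveaxis(Y, 2, 0).reshape(10, -1))[0][:, 0]
# Fix the overall sign so the initial estimate correlates positively with Y.
a *= np.sign(np.einsum("ijk,i,j,k->", Y, a, b, c))

# Gradient descent in the factored coordinates of the Segre manifold,
# minimizing f(a, b, c) = 0.5 * ||a ∘ b ∘ c - Y||_F^2.
step = 0.3
for _ in range(100):
    X = np.einsum("i,j,k->ijk", a, b, c)   # current rank-one iterate
    R = X - Y                              # residual tensor
    ga = np.einsum("ijk,j,k->i", R, b, c)  # partial gradient w.r.t. a
    gb = np.einsum("ijk,i,k->j", R, a, c)
    gc = np.einsum("ijk,i,j->k", R, a, b)
    a, b, c = a - step * ga, b - step * gb, c - step * gc

X = np.einsum("i,j,k->ijk", a, b, c)
print("relative error vs. planted tensor:",
      np.linalg.norm(X - T) / np.linalg.norm(T))
```

With small noise, the printed relative error should settle near the noise floor rather than at zero, mirroring the linear-convergence-to-the-noise-floor behavior the abstract describes for RGD.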
Similar Papers
Riemannian Optimization for Distance Geometry: A Study of Convergence, Robustness, and Incoherence
Optimization and Control
Finds hidden shapes from incomplete distance clues.
Guaranteed Nonconvex Low-Rank Tensor Estimation via Scaled Gradient Descent
Machine Learning (Stat)
Cleans messy data to find hidden patterns faster.
A Scalable Factorization Approach for High-Order Structured Tensor Recovery
Machine Learning (CS)
Makes complex math problems solve faster and easier.