
Guaranteed Noisy CP Tensor Recovery via Riemannian Optimization on the Segre Manifold

Published: October 1, 2025 | arXiv ID: 2510.00569v1

By: Ke Xu, Yuefeng Han

Potential Business Impact:

Recovers reliable low-rank structure (hidden patterns) from noisy, high-dimensional data.

Business Areas:
A/B Testing, Data and Analytics

Recovering a low-CP-rank tensor from noisy linear measurements is a central challenge in high-dimensional data analysis, with applications spanning tensor PCA, tensor regression, and beyond. We exploit the intrinsic geometry of rank-one tensors by casting the recovery task as an optimization problem over the Segre manifold, the smooth Riemannian manifold of rank-one tensors. This geometric viewpoint yields two powerful algorithms: Riemannian Gradient Descent (RGD) and Riemannian Gauss-Newton (RGN), each of which preserves feasibility at every iteration. Under mild noise assumptions, we prove that RGD converges at a local linear rate, while RGN exhibits an initial local quadratic convergence phase that transitions to a linear rate as the iterates approach the statistical noise floor. Extensive synthetic experiments validate these convergence guarantees and demonstrate the practical effectiveness of our methods.
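To make the geometry concrete, below is a minimal NumPy sketch of one RGD scheme of the kind the abstract describes, for a third-order CP-rank-1 tensor: form the Euclidean gradient of the least-squares loss, project it onto the tangent space of the Segre manifold, take a step, and retract via a best rank-one approximation. The measurement model, step size, iteration counts, and helper names (outer3, tangent_project, retract_rank1, rgd) are illustrative assumptions, not the paper's implementation or constants.

```python
import numpy as np

rng = np.random.default_rng(0)

def outer3(a, b, c):
    """Rank-one tensor a (x) b (x) c."""
    return np.einsum('i,j,k->ijk', a, b, c)

def tangent_project(G, a, b, c):
    """Project G onto the tangent space of the Segre manifold at
    T = lam * a(x)b(x)c (unit a, b, c): vectors of the form
    u(x)b(x)c + a(x)v(x)c + a(x)b(x)w with v orthogonal to b, w to c."""
    u = np.einsum('ijk,j,k->i', G, b, c)
    v = np.einsum('ijk,i,k->j', G, a, c)
    v -= (v @ b) * b
    w = np.einsum('ijk,i,j->k', G, a, b)
    w -= (w @ c) * c
    return outer3(u, b, c) + outer3(a, v, c) + outer3(a, b, w)

def retract_rank1(T, sweeps=10):
    """Retraction onto the Segre manifold: best rank-one approximation
    of T via alternating power iteration. Returns (lam, a, b, c)."""
    p, q, _ = T.shape
    a = rng.standard_normal(p); a /= np.linalg.norm(a)
    b = rng.standard_normal(q); b /= np.linalg.norm(b)
    for _ in range(sweeps):
        c = np.einsum('ijk,i,j->k', T, a, b); c /= np.linalg.norm(c)
        a = np.einsum('ijk,j,k->i', T, b, c); a /= np.linalg.norm(a)
        b = np.einsum('ijk,i,k->j', T, a, c)
        lam = np.linalg.norm(b); b /= lam
    return lam, a, b, c

def rgd(As, y, lam, a, b, c, step=1.0, iters=200):
    """RGD for 0.5 * sum_i (<A_i, T> - y_i)^2 over rank-one tensors T."""
    for _ in range(iters):
        T = lam * outer3(a, b, c)
        resid = np.einsum('nijk,ijk->n', As, T) - y
        G = np.einsum('n,nijk->ijk', resid, As)   # Euclidean gradient
        xi = tangent_project(G, a, b, c)          # Riemannian gradient
        lam, a, b, c = retract_rank1(T - step * xi)
    return lam, a, b, c

# Toy instance: Gaussian linear measurements of a planted rank-one tensor.
p = q = r = 8
n = 5 * (p + q + r)
a0, b0, c0 = (rng.standard_normal(d) for d in (p, q, r))
T_star = outer3(a0, b0, c0)
As = rng.standard_normal((n, p, q, r)) / np.sqrt(n)
y = np.einsum('nijk,ijk->n', As, T_star) + 1e-3 * rng.standard_normal(n)

init = retract_rank1(np.einsum('n,nijk->ijk', y, As))  # spectral warm start
lam, a, b, c = rgd(As, y, *init)
err = np.linalg.norm(lam * outer3(a, b, c) - T_star) / np.linalg.norm(T_star)
print(f"relative recovery error: {err:.2e}")
```

Retracting through a best rank-one approximation is one standard way to keep every iterate exactly on the Segre manifold, which is the feasibility property the abstract emphasizes; the spectral warm start (the rank-one approximation of the backprojection of y) is the usual way to land inside the local convergence basin that results of this type assume.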

Country of Origin
🇺🇸 United States

Page Count
33 pages

Category
Statistics: Machine Learning (stat.ML)