A Smoothing Newton Method for Rank-one Matrix Recovery

Published: July 30, 2025 | arXiv ID: 2507.23017v1

By: Tyler Maunu, Gabriel Abreu

Potential Business Impact:

Stabilizes a fast algorithm for recovering signals from intensity-only (phase retrieval) measurements, making the reconstruction more reliable without sacrificing speed.

Business Areas:
A/B Testing; Data and Analytics

We consider the phase retrieval problem, which involves recovering a rank-one positive semidefinite matrix from rank-one measurements. A recently proposed algorithm based on Bures-Wasserstein gradient descent (BWGD) exhibits superlinear convergence, but it is unstable, and existing theory only establishes local linear convergence for higher-rank matrix recovery. We resolve this gap by revealing that BWGD implements Newton's method on a nonsmooth and nonconvex objective. We develop a smoothing framework that regularizes the objective, enabling a stable method with rigorous superlinear convergence guarantees. Experiments on synthetic data demonstrate the method's superior stability while maintaining fast convergence.
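
To make the problem setting concrete, the sketch below sets up the rank-one phase retrieval task described in the abstract: recover a vector x (up to sign) from quadratic measurements y_i = (a_i^T x)^2. It uses a standard spectral initialization and plain gradient descent on a least-squares objective, not the paper's BWGD or smoothing Newton method; the problem sizes, initialization scale, and step size are illustrative assumptions.

```python
# Minimal rank-one phase retrieval sketch (illustrative only; not the
# paper's BWGD / smoothing Newton method).
import numpy as np

rng = np.random.default_rng(0)
n, m = 50, 400                       # signal dimension, number of measurements
x_true = rng.standard_normal(n)
A = rng.standard_normal((m, n))      # measurement vectors a_i as rows of A
y = (A @ x_true) ** 2                # rank-one measurements y_i = (a_i^T x)^2

# Spectral initialization: leading eigenvector of (1/m) sum_i y_i a_i a_i^T,
# rescaled so that ||x_0||^2 roughly matches the mean measurement.
M = (A.T * y) @ A / m
_, eigvecs = np.linalg.eigh(M)
x = eigvecs[:, -1] * np.sqrt(np.mean(y))

step = 0.05 / np.mean(y)             # heuristic step size (assumption)
for _ in range(1000):
    r = (A @ x) ** 2 - y             # residuals of the quadratic measurements
    grad = (A.T * r) @ (A @ x) * (4.0 / m)   # gradient of (1/m) sum_i r_i^2
    x -= step * grad

# x is only identifiable up to a global sign flip.
err = min(np.linalg.norm(x - x_true), np.linalg.norm(x + x_true))
print(f"relative error: {err / np.linalg.norm(x_true):.2e}")
```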

Page Count
12 pages

Category
Statistics: Machine Learning (stat.ML)