A Riemannian Gradient Descent Method for the Least Squares Inverse Eigenvalue Problem
By: Alban Bloor Riley, Marcus Webb, Michael L. Baker
Potential Business Impact:
Fits physical models to measured spectral data much faster.
We analyse an algorithm for the least squares fitting of a subset of the eigenvalues of an unknown Hermitian matrix lying in an affine subspace, called the Lift and Projection (LP) method, due to Chen and Chu (SIAM Journal on Numerical Analysis, 33 (1996), pp. 2417-2430). The LP method iteratively `lifts' the current iterate onto the spectral constraint manifold, then `projects' it back onto the affine subspace. We prove that this is equivalent to Riemannian gradient descent with respect to a natural Riemannian metric. This insight allows us to derive a more efficient implementation, analyse its global convergence properties more precisely, and naturally append additional constraints to the problem. We provide several numerical experiments demonstrating the improvement in computation time, which can exceed an order of magnitude when the eigenvalue constraints are on the smallest eigenvalues, the largest eigenvalues, or the eigenvalues closest to a given number. These experiments include an inverse eigenvalue problem arising in inelastic neutron scattering of Manganese-6, which requires the least squares fitting of 16 experimentally observed eigenvalues of a $32400 \times 32400$ sparse matrix from a 5-dimensional subspace of spin Hamiltonian matrices.
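To make the lift-and-projection iteration concrete, here is a minimal NumPy sketch of one way it could be implemented. The setup assumed here: the affine subspace consists of matrices $A(c) = A_0 + \sum_k c_k B_k$ with Hermitian basis matrices $B_k$, the `lift' replaces the selected eigenvalues of the current iterate (matched in ascending order at positions `idx`) with the target values while keeping its eigenvectors, and the `projection' is the Frobenius-norm least-squares projection back onto the affine subspace via the normal equations. All names, the matching convention, and the toy problem at the end are illustrative assumptions, not the authors' code.

```python
import numpy as np

def lift(A, target_eigs, idx):
    # "Lift": move A onto the spectral constraint manifold by replacing
    # the eigenvalues at positions idx (eigh returns them in ascending
    # order) with the target values, keeping A's eigenvectors.
    w, V = np.linalg.eigh(A)
    w = w.copy()
    w[idx] = target_eigs
    return (V * w) @ V.conj().T

def project(M, A0, basis):
    # "Projection": Frobenius-norm least-squares projection of M onto
    # the affine subspace {A0 + sum_k c_k B_k}, via the normal equations
    # G c = b with G_jk = <B_j, B_k>_F and b_j = <B_j, M - A0>_F.
    G = np.array([[np.vdot(Bj, Bk).real for Bk in basis] for Bj in basis])
    b = np.array([np.vdot(Bj, M - A0).real for Bj in basis])
    c = np.linalg.solve(G, b)
    return A0 + sum(ck * Bk for ck, Bk in zip(c, basis)), c

def lp_method(A0, basis, target_eigs, idx, max_iter=500, tol=1e-10):
    # Alternate lift and projection until the iterate stops moving.
    A, c = A0.copy(), np.zeros(len(basis))
    for _ in range(max_iter):
        A_new, c = project(lift(A, target_eigs, idx), A0, basis)
        if np.linalg.norm(A_new - A, "fro") <= tol * max(1.0, np.linalg.norm(A, "fro")):
            return A_new, c
        A = A_new
    return A, c

# Toy usage (hypothetical): fit the two smallest eigenvalues of a real
# symmetric matrix drawn from a random 3-dimensional affine subspace.
rng = np.random.default_rng(0)
n = 8
A0 = np.zeros((n, n))
basis = [B + B.T for B in (rng.standard_normal((n, n)) for _ in range(3))]
A_fit, c = lp_method(A0, basis, target_eigs=[-1.0, 0.0], idx=[0, 1])
print(c, np.linalg.eigvalsh(A_fit)[:2])
```

Note that this sketch uses a full eigendecomposition, costing $O(n^3)$ per step; the paper's Riemannian gradient descent reformulation is what admits cheaper iterations when only a few smallest, largest, or near-a-shift eigenvalues are constrained, consistent with using an iterative eigensolver, which is where the reported order-of-magnitude speedups arise.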
Similar Papers
A Ritz method for solution of parametric generalized EVPs
Numerical Analysis
Finds important numbers in complex math problems faster.
Eigen-inference by Marchenko-Pastur inversion
Statistics Theory
Finds hidden patterns in big, messy data.
An efficient algorithm for the minimal least squares solution of linear system with indefinite symmetric matrices
Numerical Analysis
Solves math problems faster and more accurately.