No Eigenvalues Outside the Limiting Support of Generally Correlated and Noncentral Sample Covariance Matrices
By: Zeyan Zhuang, Xin Zhang, Dongfang Xu, and more
Potential Business Impact:
Provides theoretical performance guarantees for multi-user MIMO wireless systems, helping improve signal quality in crowded areas.
Spectral properties of random matrices play an important role in statistics, machine learning, communications, and many other areas. Notable results on the convergence of the empirical spectral distribution (ESD) and the "no-eigenvalue" property have been obtained for random matrices with various correlation structures. However, the corresponding spectral analysis for generally correlated and noncentral random matrices remains incomplete, and this paper aims to fill that gap. Specifically, we consider matrices whose columns are independent but have non-zero means and non-identical correlations. Under high-dimensional asymptotics, where the numbers of rows and columns grow to infinity simultaneously, we first establish the almost sure convergence of the ESD of the random matrices under consideration to a deterministic limit, under mild conditions. Furthermore, we prove that, with probability 1, no eigenvalues appear in any closed interval outside the support of the limiting distribution once the matrix dimensions are sufficiently large. These results can be applied in areas such as statistics, wireless communications, and signal processing. In this paper, we apply them to two communication scenarios: 1) we determine the limiting signal-to-interference-plus-noise ratio of multi-user multiple-input multiple-output (MIMO) systems with linear minimum mean-square error receivers; and 2) we establish the invertibility of zero-forcing precoding matrices in downlink MIMO systems, providing theoretical guarantees.
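As an illustration of the matrix model described in the abstract, the following is a minimal numerical sketch (an assumption-laden demonstration, not the paper's code or proof technique): it builds a sample covariance matrix B Bᵀ / n whose columns are independent with non-zero means and non-identical correlations, then inspects its empirical spectral distribution and the location of the largest eigenvalue. The dimensions, the AR(1)-style correlation classes, and the cosine-shaped means are hypothetical choices made only for this demo.

```python
# Minimal sketch (illustrative assumptions): simulate a generally correlated,
# noncentral sample covariance matrix and look at its empirical spectral
# distribution (ESD) and largest eigenvalue.
import numpy as np

rng = np.random.default_rng(0)
N, n = 300, 900                     # numbers of rows and columns, both "large"

def ar1_chol(rho, N):
    """Cholesky factor of an AR(1) correlation matrix with coefficient rho (assumed model)."""
    C = rho ** np.abs(np.subtract.outer(np.arange(N), np.arange(N)))
    return np.linalg.cholesky(C)

# A few distinct correlation classes -> non-identical per-column correlations.
chols = [ar1_chol(rho, N) for rho in (0.1, 0.4, 0.7)]

cols = []
for j in range(n):
    L = chols[j % len(chols)]                                          # column-dependent correlation
    mu = np.cos(2 * np.pi * (j + 1) * np.arange(N) / N) / np.sqrt(N)   # non-zero mean (assumed shape)
    cols.append(mu + L @ rng.standard_normal(N))                       # independent columns

B = np.column_stack(cols)
S = (B @ B.T) / n                   # generally correlated, noncentral sample covariance matrix

eigs = np.linalg.eigvalsh(S)
hist, edges = np.histogram(eigs, bins=50, density=True)
print(f"eigenvalue range: [{eigs[0]:.3f}, {eigs[-1]:.3f}]")
print("ESD histogram (density per bin):")
print(np.round(hist, 3))
```

Rerunning with larger N and n (keeping their ratio fixed) should show the histogram stabilizing and the extreme eigenvalues staying close to the edges of a fixed support, which is the empirical behavior that the paper's ESD-convergence and no-eigenvalue results formalize.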
Similar Papers
On eigenvalues of a renormalized sample correlation matrix
Statistics Theory
Finds if data is related, even with lots of info.
Improved dependence on coherence in eigenvector and eigenvalue estimation error bounds
Statistics Theory
Finds hidden patterns in messy data better.
Spectral analysis of spatial-sign covariance matrices for heavy-tailed data with dependence
Statistics Theory
Helps computers understand messy data better.