Linear Discriminant Analysis with Gradient Optimization on Covariance Inverse
By: Cencheng Shen, Yuexiao Dong
Potential Business Impact:
Improves automated classification of complex, high-dimensional data.
Linear discriminant analysis (LDA) is a fundamental method in statistical pattern recognition and classification, achieving Bayes optimality under Gaussian assumptions. However, classical LDA is known to struggle in high-dimensional settings because covariance estimation becomes unstable. In this work, we propose LDA with gradient optimization (LDA-GO), a new approach that directly optimizes the inverse covariance matrix via gradient descent. The algorithm parametrizes the inverse covariance matrix through a Cholesky factorization, incorporates a low-rank extension to reduce computational complexity, and employs a multiple-initialization strategy that includes identity initialization and warm-starting from the classical LDA estimate. The effectiveness of LDA-GO is demonstrated through extensive multivariate simulations and real-data experiments.
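The abstract describes the core idea of LDA-GO: parametrize the inverse covariance through a Cholesky-style factor and tune that factor by gradient descent, starting either from the identity or from a classical LDA estimate. Below is a minimal sketch of that idea, not the authors' implementation; the function name `fit_lda_go`, the use of PyTorch, the cross-entropy objective, and the step-count and learning-rate parameters are all illustrative assumptions.

```python
# Minimal sketch of the LDA-GO idea (assumed details, not the paper's code):
# parametrize Sigma_inv = C @ C.T via a Cholesky-style factor C and optimize C
# by gradient descent on a classification loss over the LDA discriminant scores.
import torch

def fit_lda_go(X, y, n_steps=200, lr=0.01, warm_start_cov=None):
    """X: (n, p) float tensor; y: (n,) integer class labels."""
    n, p = X.shape
    classes = torch.unique(y)                      # sorted unique labels
    y_idx = torch.searchsorted(classes, y)         # map labels to 0..k-1
    means = torch.stack([X[y == c].mean(dim=0) for c in classes])  # (k, p)

    # Initialization: identity, or warm start from a classical LDA covariance estimate.
    if warm_start_cov is None:
        C = torch.eye(p)
    else:
        C = torch.linalg.cholesky(torch.linalg.inv(warm_start_cov))
    C = C.clone().requires_grad_(True)

    opt = torch.optim.Adam([C], lr=lr)
    for _ in range(n_steps):
        opt.zero_grad()
        sigma_inv = C @ C.T                        # positive semidefinite by construction
        # LDA discriminant scores: x' Sigma_inv mu_c - 0.5 * mu_c' Sigma_inv mu_c
        lin = X @ sigma_inv @ means.T              # (n, k)
        quad = 0.5 * torch.einsum('kp,pq,kq->k', means, sigma_inv, means)
        scores = lin - quad
        loss = torch.nn.functional.cross_entropy(scores, y_idx)
        loss.backward()
        opt.step()
    return C.detach(), means, classes
```

Classification with the fitted factor would then assign each new point to the class with the largest discriminant score; a low-rank variant, as mentioned in the abstract, could be obtained by making `C` a (p, r) factor with r < p, at the cost of a rank-r inverse covariance.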
Similar Papers
A Convex formulation for linear discriminant analysis
Machine Learning (CS)
Helps computers sort data better, even with lots of info.
Spatial Sign based Direct Sparse Linear Discriminant Analysis for High Dimensional Data
Methodology
Helps computers sort data better, even when it's messy.
Non-Asymptotic Analysis of Data Augmentation for Precision Matrix Estimation
Machine Learning (Stat)
Helps computers learn better from more data.