Evaluating Singular Value Thresholds for DNN Weight Matrices based on Random Matrix Theory
By: Kohei Nishikawa, Koki Shimizu, Hiroki Hashiguchi
This study evaluates thresholds for removing singular values in singular value decomposition (SVD)-based low-rank approximations of deep neural network weight matrices. Each weight matrix is modeled as the sum of a signal matrix and a noise matrix. The low-rank approximation is obtained by removing noise-related singular values using a threshold derived from random matrix theory. To assess the adequacy of this threshold, we propose an evaluation metric based on the cosine similarity between the singular vectors of the signal matrix and those of the original weight matrix. The proposed metric is used in numerical experiments to compare two threshold estimation methods.
Similar Papers
Auto Tensor Singular Value Thresholding: A Non-Iterative and Rank-Free Framework for Tensor Denoising
Machine Learning (CS)
Near-optimal Rank Adaptive Inference of High Dimensional Matrices
Information Theory
PCA recovery thresholds in low-rank matrix inference with sparse noise
Machine Learning (Stat)