An Efficient Alternating Algorithm for ReLU-based Symmetric Matrix Decomposition
By: Qingsong Wang
Potential Business Impact:
Finds hidden patterns in data faster.
Symmetric matrix decomposition is an active research area in machine learning. This paper focuses on exploiting the low-rank structure of non-negative and sparse symmetric matrices via the rectified linear unit (ReLU) activation function. We propose the ReLU-based nonlinear symmetric matrix decomposition (ReLU-NSMD) model, introduce an accelerated alternating partial Bregman (AAPB) method for its solution, and present the algorithm's convergence results. Our algorithm leverages the Bregman proximal gradient framework to overcome the challenge of estimating the global $L$-smooth constant in the classic proximal gradient algorithm. Numerical experiments on synthetic and real datasets validate the effectiveness of our model and algorithm.
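The abstract does not spell out the ReLU-NSMD objective or the AAPB update rules, but the core idea of fitting a nonnegative sparse symmetric matrix through a ReLU nonlinearity can be illustrated with a minimal sketch. The following is an assumption-laden toy version, not the paper's method: it models the target as $M \approx \mathrm{ReLU}(UU^\top)$ and minimizes $\tfrac{1}{2}\lVert \mathrm{ReLU}(UU^\top) - M\rVert_F^2$ with plain fixed-step gradient descent rather than the paper's accelerated alternating partial Bregman scheme; the factorization form, loss, and step size are all illustrative choices.

```python
import numpy as np

# Toy sketch (NOT the paper's ReLU-NSMD/AAPB algorithm): fit a nonnegative,
# sparse, symmetric matrix M with ReLU(U @ U.T) by fixed-step gradient descent.
rng = np.random.default_rng(0)
n, r = 30, 4

# Build a nonnegative sparse symmetric target of (approximate) rank r.
U_true = rng.standard_normal((n, r))
M = np.maximum(U_true @ U_true.T, 0.0)

def loss(U):
    """Squared Frobenius misfit of ReLU(U U^T) against M."""
    return 0.5 * np.linalg.norm(np.maximum(U @ U.T, 0.0) - M, "fro") ** 2

U = rng.standard_normal((n, r)) * 0.1  # small random initialization
step = 1e-3                            # illustrative fixed step size
losses = [loss(U)]
for _ in range(200):
    P = U @ U.T
    # Chain rule through the elementwise ReLU (subgradient at 0 taken as 0):
    R = (np.maximum(P, 0.0) - M) * (P > 0)
    grad = (R + R.T) @ U               # gradient of the loss w.r.t. U
    U -= step * grad
    losses.append(loss(U))

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The fixed step size is exactly the pain point the abstract highlights: a plain (proximal) gradient loop needs a global $L$-smooth constant to set it safely, whereas the paper's Bregman proximal gradient framework avoids that estimate.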
Similar Papers
An Accelerated Alternating Partial Bregman Algorithm for ReLU-based Matrix Decomposition
Machine Learning (CS)
Finds hidden patterns in messy data.
Efficient algorithms for the Hadamard decomposition
Machine Learning (CS)
Makes big data smaller and easier to use.
Low-Rank Matrix Approximation for Neural Network Compression
Machine Learning (CS)
Makes smart computer programs run faster and smaller.