An Efficient Alternating Algorithm for ReLU-based Symmetric Matrix Decomposition

Published: March 21, 2025 | arXiv ID: 2503.16846v2

By: Qingsong Wang

Potential Business Impact:

Recovers hidden low-rank patterns in sparse, non-negative data more efficiently.

Business Areas:
A/B Testing, Data and Analytics

Symmetric matrix decomposition is an active research area in machine learning. This paper exploits the low-rank structure of non-negative, sparse symmetric matrices via the rectified linear unit (ReLU) activation function. We propose the ReLU-based nonlinear symmetric matrix decomposition (ReLU-NSMD) model, introduce an accelerated alternating partial Bregman (AAPB) method to solve it, and establish convergence results for the algorithm. The algorithm builds on the Bregman proximal gradient framework, which avoids the difficulty of estimating the global $L$-smooth constant required by the classical proximal gradient method. Numerical experiments on synthetic and real datasets validate the effectiveness of our model and algorithm.
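
The listing does not reproduce the paper's formulation or the AAPB updates, so the following is only a minimal sketch of the underlying idea, assuming the model takes the form $M \approx \mathrm{ReLU}(UU^\top)$ with a low-rank factor $U$. It uses plain gradient descent on a squared Frobenius loss in place of the paper's accelerated alternating partial Bregman scheme; the function name `relu_nsmd_sketch`, the step size, and the iteration count are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def relu(X):
    return np.maximum(X, 0.0)

def relu_nsmd_sketch(M, rank, steps=10000, lr=5e-4, seed=0):
    """Fit U so that ReLU(U @ U.T) roughly reconstructs the symmetric matrix M.

    Plain gradient descent on ||ReLU(U U^T) - M||_F^2; this stands in for the
    paper's AAPB / Bregman proximal gradient method, which is not shown here.
    """
    rng = np.random.default_rng(seed)
    n = M.shape[0]
    U = 0.1 * rng.standard_normal((n, rank))
    for _ in range(steps):
        Z = U @ U.T
        G_Z = 2.0 * (relu(Z) - M) * (Z > 0.0)   # d/dZ of the squared Frobenius loss
        U -= lr * (G_Z + G_Z.T) @ U             # chain rule through Z = U U^T
    return U

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, r = 50, 5
    U_true = rng.standard_normal((n, r)) / np.sqrt(r)
    M = relu(U_true @ U_true.T)                 # non-negative, sparse, symmetric target
    U_hat = relu_nsmd_sketch(M, rank=r)
    rel_err = np.linalg.norm(relu(U_hat @ U_hat.T) - M) / np.linalg.norm(M)
    print(f"relative reconstruction error: {rel_err:.3f}")
```

The fixed step size here is exactly what the paper's Bregman proximal gradient framework is designed to avoid: instead of guessing a global $L$-smooth constant (or a safe learning rate), the AAPB method adapts the geometry of the update via a Bregman distance.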

Country of Origin
🇨🇳 China

Page Count
12 pages

Category
Computer Science:
Machine Learning (CS)