$L_1$-norm Regularized Indefinite Kernel Logistic Regression

Published: October 30, 2025 | arXiv ID: 2510.26043v1

By: Shaoxin Wang, Hanjing Yao

Potential Business Impact:

Improves classification accuracy while producing sparser, more interpretable models.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Kernel logistic regression (KLR) is a powerful classification method widely applied across diverse domains. In many real-world scenarios, indefinite kernels capture more domain-specific structural information than positive definite kernels. This paper proposes a novel $L_1$-norm regularized indefinite kernel logistic regression (RIKLR) model, which extends the existing IKLR framework by introducing sparsity via an $L_1$-norm penalty. This regularization enhances interpretability and generalization but renders the optimization problem nonsmooth and nonconvex. To address these challenges, a theoretically grounded and computationally efficient proximal linearized algorithm is developed. Experimental results on multiple benchmark datasets demonstrate the superior performance of the proposed method in terms of both accuracy and sparsity.
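The abstract's core recipe, logistic loss over a kernel expansion plus an $L_1$ penalty handled by a proximal (soft-thresholding) step, can be sketched in a few lines. The sketch below is a minimal illustration under stated assumptions, not the authors' RIKLR algorithm: the function names, the sigmoid-kernel Gram matrix (a common example of an indefinite kernel), the fixed step size, and the plain proximal-gradient loop are assumptions, and it omits the linearization and convergence analysis the paper develops for the nonsmooth, nonconvex setting.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

def soft_threshold(v, tau):
    # Proximal operator of the L1 norm (component-wise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def l1_klr_proximal(K, y, lam=0.1, step=None, n_iter=500):
    """Sparse kernel logistic regression via proximal gradient (sketch).

    K    : (n, n) kernel Gram matrix; may be indefinite.
    y    : labels in {0, 1}.
    lam  : L1 penalty weight.
    step : fixed step size; defaults to the inverse of a Lipschitz
           bound on the logistic-loss gradient, 0.25 * ||K||_2^2 / n.
    """
    n = K.shape[0]
    alpha = np.zeros(n)
    if step is None:
        L = 0.25 * np.linalg.norm(K, 2) ** 2 / n
        step = 1.0 / max(L, 1e-12)
    for _ in range(n_iter):
        p = sigmoid(K @ alpha)            # predicted probabilities
        grad = K.T @ (p - y) / n          # gradient of the mean logistic loss
        alpha = soft_threshold(alpha - step * grad, step * lam)
    return alpha

# Toy usage with a sigmoid kernel, which is generally not positive definite.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
K = np.tanh(X @ X.T + 1.0)
alpha = l1_klr_proximal(K, y, lam=0.05)
pred = (sigmoid(K @ alpha) > 0.5).astype(float)
print("train accuracy:", (pred == y).mean(), "| nonzero coefficients:", int((alpha != 0).sum()))
```

The soft-thresholding step is what produces the sparsity highlighted in the abstract: coefficients whose gradient update stays below the threshold are driven exactly to zero, so only a subset of training points contributes to the final decision function.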

Country of Origin
🇨🇳 China

Page Count
17 pages

Category
Statistics: Machine Learning (Stat)