Scaling Up ROC-Optimizing Support Vector Machines

Published: November 7, 2025 | arXiv ID: 2511.04979v1

By: Gimun Bae, Seung Jun Shin

Potential Business Impact:

Trains accurate classifiers on imbalanced data much faster.

Business Areas:
A/B Testing; Data and Analytics

The ROC-SVM, originally proposed by Rakotomamonjy, directly maximizes the area under the ROC curve (AUC) and has become an attractive alternative to conventional binary classification in the presence of class imbalance. However, its practical use is limited by high computational cost, as training involves evaluating all $O(n^2)$ positive-negative sample pairs. To overcome this limitation, we develop a scalable variant of the ROC-SVM that leverages incomplete U-statistics, thereby substantially reducing computational complexity. We further extend the framework to nonlinear classification through a low-rank kernel approximation, enabling efficient training in reproducing kernel Hilbert spaces. Theoretical analysis establishes an error bound that justifies the proposed approximation, and empirical results on both synthetic and real datasets demonstrate that the proposed method achieves AUC performance comparable to the original ROC-SVM with drastically reduced training time.
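The core idea, replacing the full $O(n^2)$ sum over positive-negative pairs with an average over a random subset of pairs (an incomplete U-statistic), can be sketched as follows. This is a minimal illustration, not the authors' implementation: the hinge surrogate for AUC, the subgradient-descent loop, the regularization and learning-rate values, and the toy Gaussian data are all assumptions made for the example.

```python
import numpy as np

def incomplete_pairwise_hinge(w, X_pos, X_neg, n_pairs, rng):
    # Incomplete U-statistic: average the pairwise hinge loss
    # max(0, 1 - w.(x_pos - x_neg)) over n_pairs randomly drawn
    # positive-negative pairs instead of all O(n^2) pairs.
    i = rng.integers(0, len(X_pos), n_pairs)
    j = rng.integers(0, len(X_neg), n_pairs)
    diffs = X_pos[i] - X_neg[j]               # (n_pairs, d)
    margins = diffs @ w
    loss = np.maximum(0.0, 1.0 - margins).mean()
    return loss, diffs, margins

def train_roc_svm(X_pos, X_neg, n_pairs=200, lam=1e-3, lr=0.1, steps=300, seed=0):
    # Subgradient descent on the regularized incomplete-pair objective;
    # a fresh batch of pairs is drawn at every step.
    rng = np.random.default_rng(seed)
    w = np.zeros(X_pos.shape[1])
    for _ in range(steps):
        _, diffs, margins = incomplete_pairwise_hinge(w, X_pos, X_neg, n_pairs, rng)
        active = margins < 1.0                # margin-violating pairs
        grad = lam * w - diffs[active].sum(axis=0) / n_pairs
        w -= lr * grad
    return w

def auc(w, X_pos, X_neg):
    # Exact AUC (the full U-statistic), used here only for evaluation.
    s_pos, s_neg = X_pos @ w, X_neg @ w
    return float((s_pos[:, None] > s_neg[None, :]).mean())

# Toy imbalanced problem: 50 positives vs 500 negatives in 5 dimensions.
rng = np.random.default_rng(1)
X_pos = rng.normal(1.0, 1.0, (50, 5))
X_neg = rng.normal(-1.0, 1.0, (500, 5))
w = train_roc_svm(X_pos, X_neg)
score = auc(w, X_pos, X_neg)
```

Because each step touches only `n_pairs` sampled pairs, per-step cost is independent of the full $O(n^2)$ pair count, which is the source of the speedup the abstract describes; the paper's error bound is what justifies that this subsampled objective still tracks the full one.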

Page Count
15 pages

Category
Computer Science:
Machine Learning (CS)