Ranked Set Sampling-Based Multilayer Perceptron: Improving Generalization via Variance-Based Bounds

Published: July 11, 2025 | arXiv ID: 2507.08465v2

By: Feijiang Li, Liuya Zhang, Jieting Wang, and more

Potential Business Impact:

Improves the accuracy of machine learning models by imposing an ordered structure on how training data are sampled.

Business Areas:
A/B Testing, Data and Analytics

The multilayer perceptron (MLP), one of the most fundamental neural networks, is extensively utilized for classification and regression tasks. In this paper, we establish a new generalization error bound, which reveals how the variance of the empirical loss influences the generalization ability of the learning model. Inspired by this bound, we advocate reducing the variance of the empirical loss to enhance the ability of the MLP. As is well known, bagging is a popular ensemble method for realizing variance reduction. However, bagging produces the base training data sets by Simple Random Sampling (SRS), which exhibits a high degree of randomness. To handle this issue, we introduce an ordered structure into the training data set via Ranked Set Sampling (RSS) to further reduce the variance of the loss, and we develop an RSS-MLP method. Theoretical results show that the variances of the empirical exponential loss and the empirical logistic loss estimated by RSS are smaller than those estimated by SRS. To validate the performance of RSS-MLP, we conduct comparison experiments on twelve benchmark data sets with the two convex loss functions under two fusion methods. Extensive experimental results and analysis illustrate the effectiveness and rationality of the proposed method.
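The variance connection is intuitive from Bernstein-type concentration: the deviation of an empirical mean from its expectation scales with the standard deviation of the summands divided by sqrt(n), so a lower-variance loss estimate tightens such bounds. To make the sampling idea concrete, below is a minimal Python sketch of balanced RSS contrasted with SRS. It is not the authors' implementation: the function name `ranked_set_sample` is hypothetical, ranking is done on the raw observed values (the paper applies RSS in the context of estimating exponential and logistic losses), and candidate sets are drawn with replacement across sets for simplicity.

```python
import numpy as np

def ranked_set_sample(values, n_cycles, set_size, rng=None):
    """Balanced Ranked Set Sampling (RSS) sketch.

    Per cycle, draw `set_size` simple random sets of `set_size` units each;
    from the i-th set keep the unit with the i-th smallest value. Returns
    indices of the selected units (n_cycles * set_size in total).
    """
    rng = np.random.default_rng(rng)
    n = len(values)
    chosen = []
    for _ in range(n_cycles):
        for rank in range(set_size):
            # One simple random candidate set; keep the unit of the target rank.
            candidates = rng.choice(n, size=set_size, replace=False)
            order = np.argsort(values[candidates])
            chosen.append(candidates[order[rank]])
    return np.array(chosen)

# Toy comparison: variance of the sample mean under SRS vs. RSS.
rng = np.random.default_rng(0)
population = rng.exponential(size=10_000)
k, m, trials = 5, 20, 2_000  # set size, cycles, Monte Carlo repeats
srs_means = [population[rng.choice(10_000, k * m, replace=False)].mean()
             for _ in range(trials)]
rss_means = [population[ranked_set_sample(population, m, k, rng)].mean()
             for _ in range(trials)]
print(f"SRS mean variance: {np.var(srs_means):.5f}")
print(f"RSS mean variance: {np.var(rss_means):.5f}")  # typically smaller
```

Running this toy comparison typically shows a noticeably smaller variance for the RSS-based estimate, which mirrors the abstract's theoretical claim that RSS-estimated losses have lower variance than their SRS counterparts.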

Country of Origin
🇨🇳 China

Page Count
21 pages

Category
Computer Science:
Machine Learning (CS)