Hadamard-Riemannian Optimization for Margin-Variance Ensemble
By: Zexu Jin
Potential Business Impact:
Makes computer predictions more accurate and reliable, especially on noisy or imbalanced data.
Ensemble learning is widely recognized as a pivotal technique for boosting predictive performance by combining multiple base models. However, conventional margin-based ensemble methods focus predominantly on maximizing the expected margin while neglecting margin variance, which restricts generalization and heightens vulnerability to overfitting, particularly on noisy or imbalanced datasets. Moreover, optimizing ensemble weights directly within the probability simplex often introduces computational inefficiency and scalability challenges, complicating application to large-scale problems. To address these limitations, this paper introduces a novel ensemble learning framework that explicitly incorporates margin variance into the loss function. The method jointly minimizes the negative expected margin and its variance, yielding enhanced robustness and improved generalization. Furthermore, by reparameterizing the ensemble weights onto the unit sphere, the optimization is substantially simplified and made more computationally efficient. Extensive experiments on multiple benchmark datasets demonstrate that the proposed approach consistently outperforms traditional margin-based ensemble techniques, underscoring its effectiveness and practical utility.
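The abstract does not give the exact formulation, but a minimal sketch consistent with it might look as follows: binary labels in {-1, +1}, a loss of the form -E[margin] + λ·Var[margin], and a Hadamard parameterization w = v ⊙ v with v constrained to the unit sphere, optimized by projected (Riemannian) gradient descent. The function names (loss_and_rgrad, optimize_weights) and the penalty weight lam are illustrative assumptions, not the paper's actual API.

```python
import numpy as np

def loss_and_rgrad(v, H, y, lam=0.1):
    """Loss = -E[margin] + lam * Var[margin], and its Riemannian gradient.

    v : (K,) point on the unit sphere; ensemble weights are w = v * v.
    H : (N, K) base-model predictions in {-1, +1}.
    y : (N,) labels in {-1, +1}.
    """
    N = len(y)
    w = v * v                                 # Hadamard map: w_k >= 0, sum(w) = ||v||^2 = 1
    m = y * (H @ w)                           # per-sample ensemble margins
    loss = -m.mean() + lam * m.var()
    dL_dm = (-1.0 + 2.0 * lam * (m - m.mean())) / N
    g = 2.0 * v * (H.T @ (dL_dm * y))         # chain rule through w = v * v
    rgrad = g - (g @ v) * v                   # project onto the sphere's tangent space at v
    return loss, rgrad

def optimize_weights(H, y, lam=0.1, lr=0.5, steps=300, seed=0):
    rng = np.random.default_rng(seed)
    v = rng.normal(size=H.shape[1])
    v /= np.linalg.norm(v)                    # initialize on the unit sphere
    for _ in range(steps):
        _, rgrad = loss_and_rgrad(v, H, y, lam)
        v = v - lr * rgrad
        v /= np.linalg.norm(v)                # retraction back onto the sphere
    return v * v                              # recovered simplex weights

# Toy usage: three base learners of accuracy 0.9, 0.7, 0.55 on random labels.
rng = np.random.default_rng(1)
y = rng.choice([-1, 1], size=500)
H = np.stack([np.where(rng.random(500) < p, y, -y) for p in (0.9, 0.7, 0.55)], axis=1)
w = optimize_weights(H, y)
print("learned weights:", np.round(w, 3))     # should favor the most accurate model
```

Under this reading, the design point is that w = v ⊙ v with ||v|| = 1 lands in the probability simplex automatically, so each update needs only a cheap renormalization (retraction) rather than a simplex projection, which is what simplifies the optimization relative to simplex-constrained methods.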
Similar Papers
Option Pricing Using Ensemble Learning
Machine Learning (CS)
Makes computer option-price predictions more accurate.
Mitigating loss of variance in ensemble data assimilation: machine learning-based and distance-free localization
Machine Learning (CS)
Improves computer weather forecasts by reducing errors.
A Cooperative Game-Based Multi-Criteria Weighted Ensemble Approach for Multi-Class Classification
Machine Learning (CS)
Makes AI smarter by combining different "brains."