Covariance-Driven Regression Trees: Reducing Overfitting in CART

Published: January 12, 2026 | arXiv ID: 2601.07281v1

By: Likun Zhang, Wei Ma

Potential Business Impact:

Makes predictions from decision-tree models more accurate and reliable by reducing overfitting.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

Decision trees are powerful machine learning algorithms, widely used in fields such as economics and medicine for their simplicity and interpretability. However, decision trees such as CART are prone to overfitting, especially when grown deep or when the sample size is small. Conventional remedies include pre-pruning and post-pruning, which constrain the growth of uninformative branches. In this paper, we propose a complementary approach by introducing a covariance-driven splitting criterion for regression trees (CovRT). This criterion is more robust to overfitting than the empirical risk minimization criterion used in CART, as it produces more balanced and stable splits and more effectively identifies covariates that carry true signal. We establish an oracle inequality for CovRT and prove that its predictive accuracy is comparable to that of CART in high-dimensional settings. We find that CovRT achieves superior prediction accuracy compared to CART in both simulations and real-world tasks.
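
For intuition, here is a minimal sketch contrasting CART's empirical-risk-minimization split (variance reduction) with one plausible covariance-driven score: the squared sample covariance between the split indicator 1{x <= threshold} and the response. The function names and the exact form of the covariance score are illustrative assumptions made for this sketch; the paper defines the actual CovRT criterion.

```python
import numpy as np

def cart_split_score(x, y, threshold):
    """CART-style criterion: reduction in sum of squared errors from splitting at `threshold`."""
    left, right = y[x <= threshold], y[x > threshold]
    if len(left) == 0 or len(right) == 0:
        return -np.inf
    sse_parent = np.sum((y - y.mean()) ** 2)
    sse_children = np.sum((left - left.mean()) ** 2) + np.sum((right - right.mean()) ** 2)
    return sse_parent - sse_children  # larger is better

def covariance_split_score(x, y, threshold):
    """Illustrative covariance-driven criterion (an assumption, not necessarily
    the paper's exact CovRT definition): squared sample covariance between the
    split indicator 1{x <= threshold} and the response y."""
    indicator = (x <= threshold).astype(float)
    if indicator.sum() in (0, len(x)):
        return -np.inf
    return np.cov(indicator, y, bias=True)[0, 1] ** 2

def best_split(x, y, score_fn):
    """Scan midpoints between sorted unique feature values and return the best threshold."""
    xs = np.unique(x)
    candidates = (xs[:-1] + xs[1:]) / 2
    scores = [score_fn(x, y, t) for t in candidates]
    return candidates[int(np.argmax(scores))]

# Toy data: a step signal at x = 0 plus Gaussian noise.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 2.0 * (x > 0) + rng.normal(scale=1.0, size=200)
print("CART split:", best_split(x, y, cart_split_score))
print("Covariance-driven split:", best_split(x, y, covariance_split_score))
```

Under this stand-in criterion, the squared covariance works out to p²(1-p)² times the squared difference in child means (with p the left-child fraction), whereas CART's variance reduction scales with p(1-p); the steeper penalty on extreme p is one way to read the abstract's claim that the covariance-driven criterion yields more balanced and stable splits.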

Country of Origin
🇨🇳 China

Page Count
40 pages

Category
Statistics: Machine Learning (Stat)