Covariance-Driven Regression Trees: Reducing Overfitting in CART
By: Likun Zhang, Wei Ma
Decision trees are powerful machine learning algorithms, widely used in fields such as economics and medicine for their simplicity and interpretability. However, decision trees such as CART are prone to overfitting, especially when grown deep or when the sample size is small. Conventional methods to reduce overfitting include pre-pruning and post-pruning, which constrain the growth of uninformative branches. In this paper, we propose a complementary approach by introducing a covariance-driven splitting criterion for regression trees (CovRT). This criterion is more robust to overfitting than the empirical risk minimization criterion used in CART: it produces more balanced and stable splits and more effectively identifies covariates carrying true signal. We establish an oracle inequality for CovRT and prove that its predictive accuracy is comparable to that of CART in high-dimensional settings. In both simulations and real-world tasks, CovRT achieves superior prediction accuracy compared to CART.
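The contrast between the two splitting criteria can be sketched in code. The exact CovRT criterion is defined in the paper; as an illustrative assumption only, the covariance-driven score below rates a split point s by |Cov(1{x <= s}, y)|, which equals p(1-p)|mean_left - mean_right| and therefore penalizes unbalanced splits, in line with the abstract's claim of more balanced and stable splits. The CART score is the standard reduction in sum of squared errors.

```python
# Hedged sketch: covariance-driven vs. CART (variance-reduction) split scores
# on a single feature. The cov_score rule is an illustrative assumption, not
# necessarily the paper's exact CovRT criterion.
import numpy as np


def cart_score(x, y, s):
    """CART-style gain: reduction in sum of squared errors when splitting at s."""
    left, right = y[x <= s], y[x > s]
    if len(left) == 0 or len(right) == 0:
        return -np.inf
    sse = lambda v: np.sum((v - v.mean()) ** 2)
    return sse(y) - (sse(left) + sse(right))


def cov_score(x, y, s):
    """Covariance-driven score: |Cov(1{x <= s}, y)|.

    Since Cov(1{x <= s}, y) = p(1-p) * (mean_left - mean_right), where p is the
    fraction of points sent left, the factor p(1-p) favors balanced splits.
    """
    z = (x <= s).astype(float)
    if z.all() or not z.any():
        return -np.inf
    return abs(np.mean(z * y) - z.mean() * y.mean())


def best_split(x, y, score):
    """Search all candidate thresholds and return the highest-scoring one."""
    candidates = np.unique(x)[:-1]  # thresholds between consecutive values
    return max(candidates, key=lambda s: score(x, y, s))


rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 200)
y = (x > 0.5).astype(float) + rng.normal(0.0, 0.3, 200)  # step signal at 0.5

print("CART split:", round(best_split(x, y, cart_score), 2))
print("Cov  split:", round(best_split(x, y, cov_score), 2))
```

On this clean step-function example both criteria recover a threshold near the true change point; the criteria differ mainly in how they rank splits near the sample edges, where the p(1-p) factor in the covariance score discourages tiny child nodes.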