An RKHS Perspective on Tree Ensembles
By: Mehdi Dagdoug, Clement Dombry, Jean-Jil Duchamps
Potential Business Impact:
Explains why tree-based machine learning models work so well.
Random Forests and Gradient Boosting are among the most effective algorithms for supervised learning on tabular data. Both belong to the class of tree-based ensemble methods, where predictions are obtained by aggregating many randomized regression trees. In this paper, we develop a theoretical framework for analyzing such methods through Reproducing Kernel Hilbert Spaces (RKHSs) constructed on tree ensembles -- more precisely, on the random partitions generated by randomized regression trees. We establish fundamental analytical properties of the resulting Random Forest kernel, including boundedness, continuity, and universality, and show that a Random Forest predictor can be characterized as the unique minimizer of a penalized empirical risk functional in this RKHS, providing a variational interpretation of ensemble learning. We further extend this perspective to the continuous-time formulation of Gradient Boosting introduced by Dombry and Duchamps, and demonstrate that it corresponds to a gradient flow on a Hilbert manifold induced by the Random Forest RKHS. A key feature of this framework is that both the kernel and the RKHS geometry are data-dependent, offering a theoretical explanation for the strong empirical performance of tree-based ensembles. Finally, we illustrate the practical potential of this approach by introducing a kernel principal component analysis built on the Random Forest kernel, which enhances the interpretability of ensemble models, as well as GVI, a new geometric variable importance criterion.
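The Random Forest kernel at the heart of the abstract is built from the random partitions of the trees: two points are similar when many trees place them in the same leaf. A minimal sketch of this idea, using scikit-learn's `RandomForestRegressor` and a plain eigendecomposition for the kernel PCA step (the helper names and the dataset are illustrative assumptions, not the paper's exact construction):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def random_forest_kernel(forest, X1, X2):
    """Empirical RF kernel: fraction of trees in which the two points
    fall into the same leaf, i.e. the same cell of the tree's partition."""
    leaves1 = forest.apply(X1)  # shape (n1, n_trees), leaf index per tree
    leaves2 = forest.apply(X2)  # shape (n2, n_trees)
    # K[i, j] = average over trees of 1{leaf(x_i) == leaf(x_j)}
    return (leaves1[:, None, :] == leaves2[None, :, :]).mean(axis=2)

# Toy regression data (assumed for illustration only)
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 3))
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.normal(size=200)

forest = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)
K = random_forest_kernel(forest, X, X)

# Kernel PCA on the RF kernel: eigendecompose the doubly centered Gram matrix
n = K.shape[0]
H = np.eye(n) - np.ones((n, n)) / n
Kc = H @ K @ H
eigvals, eigvecs = np.linalg.eigh(Kc)  # ascending eigenvalues
top = eigvals[::-1][:2]
scores = eigvecs[:, ::-1][:, :2] * np.sqrt(np.maximum(top, 0))
```

Note that the kernel is data-dependent by construction: the tree partitions, and hence the RKHS geometry, are learned from the training sample, which is the property the abstract highlights.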
Similar Papers
Unbiased Stochastic Optimization for Gaussian Processes on Finite Dimensional RKHS
Machine Learning (CS)
Makes machine learning faster and more accurate.
Transfer Learning Across Fixed-Income Product Classes
Machine Learning (Stat)
Helps predict money values more accurately.
Ridge Boosting is Both Robust and Efficient
Methodology
Makes one computer model work well for many jobs.