Hierarchical Variable Importance with Statistical Control for Medical Data-Based Prediction
By: Joseph Paillard, Antoine Collas, Denis A. Engemann, and more
Potential Business Impact:
Finds important brain patterns for diseases.
Recent advances in machine learning have greatly expanded the repertoire of predictive methods for medical imaging. However, the interpretability of complex models remains a challenge, which limits their utility in medical applications. Model-agnostic methods have recently been proposed to measure conditional variable importance and accommodate complex non-linear models. However, they often lack power when dealing with highly correlated data, a common problem in medical imaging. We introduce Hierarchical-CPI, a model-agnostic variable importance measure that frames the inference problem as the discovery of groups of variables that are jointly predictive of the outcome. By exploring subgroups along a hierarchical tree, it remains computationally tractable while enjoying explicit family-wise error rate control. Moreover, we address the issue of vanishing conditional importance under high correlation with a tree-based importance allocation mechanism. We benchmarked Hierarchical-CPI against state-of-the-art variable importance methods. Its effectiveness is demonstrated on two neuroimaging datasets: classifying dementia diagnoses from MRI data (ADNI dataset) and analyzing the Berger effect on EEG data (TDBRAIN dataset), where it identified biologically plausible variables.
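The core idea of scoring groups of variables along a hierarchical tree can be illustrated with a simplified sketch. This is not the authors' Hierarchical-CPI implementation: it uses plain joint permutation of a feature group (not the conditional permutation scheme of CPI) and omits the family-wise error rate control, and all dataset and model choices here are illustrative assumptions.

```python
# Simplified sketch: group-wise permutation importance over a hierarchical
# clustering of features. A stand-in for the general idea only, not the
# paper's conditional (CPI) procedure or its FWER-controlled tree search.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, p = 500, 6
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=n)   # highly correlated pair
y = X[:, 0] + X[:, 2] + 0.1 * rng.normal(size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = Ridge().fit(X_tr, y_tr)
base_score = model.score(X_te, y_te)

# Build a hierarchical tree over features (features as observations).
Z = linkage(X_tr.T, method="ward")
groups = fcluster(Z, t=3, criterion="maxclust")

# Importance of a group = drop in R^2 when its features are permuted jointly,
# so correlated features are assessed together rather than one at a time.
importances = {}
for g in np.unique(groups):
    idx = np.where(groups == g)[0]
    X_perm = X_te.copy()
    X_perm[:, idx] = rng.permutation(X_perm[:, idx], axis=0)
    importances[int(g)] = base_score - model.score(X_perm, y_te)
```

Scoring groups rather than single variables is what avoids the vanishing-importance problem: permuting only one of two nearly duplicated features leaves the model's predictions almost unchanged, while permuting the group that contains both reveals their shared predictive contribution.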
Similar Papers
Inference on Local Variable Importance Measures for Heterogeneous Treatment Effects
Methodology
Helps doctors choose best treatments for each person.
Variable Selection Using Relative Importance Rankings
Machine Learning (Stat)
Finds best clues to predict outcomes.
Hierarchical Causal Structure Learning
Methodology
Finds causes in data with different levels.