shapr: Explaining Machine Learning Models with Conditional Shapley Values in R and Python

Published: April 2, 2025 | arXiv ID: 2504.01842v2

By: Martin Jullum, Lars Henry Berge Olsen, Jon Lachmann, and more

Potential Business Impact:

Explains how individual input features drive a model's predictions, making automated decisions easier to audit and trust.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

This paper introduces the shapr R package, a versatile tool for generating Shapley value-based prediction explanations for machine learning and statistical regression models. The shaprpy Python library brings the core capabilities of shapr to the Python ecosystem. Shapley values originate from cooperative game theory in the 1950s, but have in recent years become a widely used method for quantifying how a model's features/covariates contribute to specific prediction outcomes. The shapr package emphasizes conditional Shapley value estimates, providing a comprehensive range of approaches for accurately capturing feature dependencies -- a crucial aspect for correct model explanation that is typically lacking in similar software. In addition to regular tabular data, the shapr R package includes specialized functionality for explaining time series forecasts. The package offers a minimal set of user functions with sensible default values for most use cases while providing extensive flexibility for advanced users to fine-tune computations. Additional features include parallelized computations, iterative estimation with convergence detection, and rich visualization tools. shapr also extends its functionality to compute causal and asymmetric Shapley values when causal information is available. Overall, the shapr and shaprpy packages aim to enhance the interpretability of predictive models within a powerful and user-friendly framework.
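To make the workflow concrete, the sketch below fits a standard scikit-learn model and requests conditional Shapley value explanations through shaprpy. It is a minimal sketch under stated assumptions, not the definitive API: the top-level explain() entry point is taken from the paper's description of shaprpy, but the exact argument names (x_train, x_explain, approach, phi0) are assumptions that may differ between package versions.

```python
# Minimal sketch of explaining predictions with shaprpy.
# NOTE: argument names such as `approach` and `phi0` are assumptions
# for illustration and may differ from the installed shaprpy version.
import pandas as pd
from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import RandomForestRegressor
from shaprpy import explain  # Python counterpart of the shapr R package

# A small tabular regression problem
data = fetch_california_housing(as_frame=True)
X, y = data.data, data.target
x_train, y_train = X.iloc[:1000], y.iloc[:1000]
x_explain = X.iloc[1000:1005]  # a handful of predictions to explain

# Fit any predictive model; the explanation step is model-agnostic
model = RandomForestRegressor(n_estimators=100, random_state=1).fit(x_train, y_train)

# Conditional Shapley values: the `approach` argument selects how the
# conditional feature distributions are estimated (the paper describes a
# range of such approaches), and `phi0` is the reference prediction,
# here taken as the training-set mean. Both names are assumptions.
explanation = explain(
    model=model,
    x_train=x_train,
    x_explain=x_explain,
    approach="gaussian",
    phi0=y_train.mean(),
)
print(explanation)
```

The key design point reflected here is that capturing feature dependence is delegated to the chosen approach for estimating conditional distributions, which is what distinguishes conditional Shapley values from methods that assume feature independence.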

Page Count
40 pages

Category
Computer Science:
Machine Learning (CS)