shapr: Prediction Explanation with Dependence-Aware Shapley Values

Complex machine learning models are often hard to interpret. However, in many situations it is crucial to understand and explain why a model made a specific prediction. Shapley values constitute the only prediction explanation framework with a solid theoretical foundation. Previously known methods for estimating Shapley values do, however, assume feature independence. This package implements methods that account for feature dependence, and thereby produces more accurate estimates of the true Shapley values. An accompanying 'Python' wrapper ('shaprpy') is available through the GitHub repository.
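A minimal usage sketch is given below, based on the single-call explain() interface of the 1.x releases; the exact argument names (e.g. phi0) and the chosen approach are assumptions that should be checked against the package documentation.

library(shapr)
library(xgboost)

# Example data: airquality with complete cases only (hypothetical choice for illustration)
data("airquality")
data <- airquality[complete.cases(airquality), ]
x_var <- c("Solar.R", "Wind", "Temp", "Month")
y_var <- "Ozone"

# Hold out a few observations to explain
ind_explain <- 1:6
x_train <- data[-ind_explain, x_var]
y_train <- data[-ind_explain, y_var]
x_explain <- data[ind_explain, x_var]

# Fit a simple xgboost model to the training data
model <- xgboost(
  data = as.matrix(x_train),
  label = y_train,
  nrounds = 20,
  verbose = FALSE
)

# Dependence-aware Shapley value explanations; the "empirical" approach
# estimates the conditional feature distributions non-parametrically
explanation <- explain(
  model = model,
  x_explain = x_explain,
  x_train = x_train,
  approach = "empirical",
  phi0 = mean(y_train)  # baseline prediction; argument name assumed from the 1.x API
)

print(explanation)
plot(explanation)  # requires ggplot2

Other values of approach (for example "gaussian" or "ctree") model the feature dependence differently; see the package vignettes for the available options.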

Version: 1.0.4
Depends: R (≥ 3.5.0)
Imports: stats, data.table (≥ 1.15.0), Rcpp (≥ 0.12.15), Matrix, future.apply, methods, cli, rlang
LinkingTo: RcppArmadillo, Rcpp
Suggests: ranger, xgboost, mgcv, testthat (≥ 3.0.0), knitr, rmarkdown, roxygen2, ggplot2, gbm, party, partykit, waldo, progressr, future, ggbeeswarm, vdiffr, forecast, torch, GGally, coro, parsnip, recipes, workflows, tune, dials, yardstick, hardhat, rsample
Published: 2025-04-28
DOI: 10.32614/CRAN.package.shapr
Author: Martin Jullum ORCID iD [cre, aut], Lars Henry Berge Olsen ORCID iD [aut], Annabelle Redelmeier [aut], Jon Lachmann ORCID iD [aut], Nikolai Sellereite ORCID iD [aut], Anders Løland [ctb], Jens Christian Wahl [ctb], Camilla Lingjærde [ctb], Norsk Regnesentral [cph, fnd]
Maintainer: Martin Jullum <Martin.Jullum at nr.no>
BugReports: https://github.com/NorskRegnesentral/shapr/issues
License: MIT + file LICENSE
URL: https://norskregnesentral.github.io/shapr/, https://github.com/NorskRegnesentral/shapr/
NeedsCompilation: yes
Language: en-US
Citation: shapr citation info
Materials: README NEWS
In views: MachineLearning
CRAN checks: shapr results

Linking:

Please use the canonical form https://CRAN.R-project.org/package=shapr to link to this page.