ModelOriented / shapper

An R wrapper of SHAP python library
https://modeloriented.github.io/shapper/

SHAP Feature Importance #25

Open Steviey opened 4 years ago

Steviey commented 4 years ago

Hi there,

is there an established way to obtain SHAP feature importance using shapper?

Reading this https://christophm.github.io/interpretable-ml-book/shap.html#shap-feature-importance

...I would guess that looping over "shapper::individual_variable_effect" and taking the mean of the absolute attributions per vname would do the trick.

Am I wrong?
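The averaging idea above can be sketched with a toy example. This is only an illustration, not shapper code: the rows mimic the long-format data frame that `shapper::individual_variable_effect` returns (columns `_vname_` and `_attribution_`), and the values are made up. Feature importance is then the mean of the absolute attributions per variable, as defined in the linked chapter.

```python
# Toy demonstration: SHAP feature importance as
# mean(|attribution|) per variable. The rows mimic the
# long-format output of shapper::individual_variable_effect
# (columns `_vname_` and `_attribution_`); values are invented.
from collections import defaultdict

rows = [
    {"_vname_": "sepal_length", "_attribution_": 0.30},
    {"_vname_": "sepal_length", "_attribution_": -0.10},
    {"_vname_": "petal_width",  "_attribution_": 0.05},
    {"_vname_": "petal_width",  "_attribution_": -0.15},
]

# collect |attribution| per variable name
abs_attr = defaultdict(list)
for r in rows:
    abs_attr[r["_vname_"]].append(abs(r["_attribution_"]))

# mean absolute attribution per feature
importance = {v: sum(a) / len(a) for v, a in abs_attr.items()}
print(importance)  # sepal_length: 0.20, petal_width: 0.10
```

The same aggregation in R would be a one-liner over the returned data frame, e.g. `aggregate` of `abs(x$"_attribution_")` by `x$"_vname_"`.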

Is there any plan to integrate the original functions, like summary_plot to obtain SHAP feature importance?

By the way, when I try to feed the function individual_variable_effect with multiple new observations new_observation = testX[1:5, ] I get errors.

Error in `$<-.data.frame`(`*tmp*`, "_attribution_", value = c(0, -0.365675633989662, : replacement has 140 rows, data has 70

stereolith commented 4 years ago

Hello, I needed exactly this functionality for a university project and implemented it here: #26 . Additionally, to cope with larger data sets, I wrapped the kmeans function of the SHAP Python library to help summarize background data instances.
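For context, the background-summarization step mentioned above can be sketched with plain k-means clustering: the full background data set is replaced by a small number of cluster centers before attributions are computed. This is a hedged illustration using scikit-learn rather than the SHAP library's own helper, and the data is random.

```python
# Sketch (assumes scikit-learn is available): summarize a
# background data set to k representative points via k-means,
# analogous in spirit to the SHAP Python library's kmeans helper.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))  # full background data, 500 rows

k = 10  # number of representative background points to keep
centers = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X).cluster_centers_

print(centers.shape)  # (10, 4): k summary rows instead of 500
```

Passing a small summary like this instead of the full data set keeps the kernel explainer's runtime manageable on larger data.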