-
```py
import shap
explainer = shap.KernelExplainer(linear_lr.predict_proba, X_train)
shap_values = explainer.shap_values(X_train)
shap.initjs()
shap.summary_plot(shap_values[0], features=X_train, color…
```
-
Hi,
1) How can I use **shap.force_plot** with the current definition of the explainer to obtain the following figure? The example below illustrates the intended use but does not work correctly. How can I co…
-
I use the exact same data, and y has 2 classes. The outputs of **LightGBM and RF** are **lists**, and the list contains two-dimensional matrices with the same dimensions as the data, but the output of **xgb…
-
### Issue Description
For an XGBoost model, the SHAP values calculated with `feature_perturbation='interventional'` and `model_output='log_loss'` differ greatly.
### Minimal Reproducible Example
```py…
-
I am trying to get SHAP interaction values through XGBoost, and I wonder whether interventional feature perturbation is unsupported but still silently returns a zero matrix. Looking at the shap cod…
-
According to the README, it should be possible to use code like the following to calculate Shapley values and create a waterfall plot using a `sklearn` classifier:
```
import shap
import sklearn
…
```
-
Hi,
I am trying to solve a binary classification problem with a dataset of 100 rows and 40 feature columns. I applied a random forest to the given dataset. After applying the RF, I…
-
**Problem Description**
Currently, it is not possible to store or retrieve the SHAP values for individual features before they are eliminated to produce a reduced feature set. This limits the analysis of S…
-
### Issue Description
When you explain RandomForest models (and sometimes even GradientBoostingTrees from sklearn) using the TreeSHAP explainer, the tree-shap script breaks.
I was assuming that this wa…
-
Hi, I was giving DeepExplainer a try to estimate SHAP interaction values, but received the following error message:
AttributeError: 'DeepExplainer' object has no attribute 'shap_interacti…