-
- [ ] LIME
- [ ] SHAP
- [ ] Kernel SHAP
- [ ] Tree SHAP
- [ ] Shapley Values
- [ ] L2X
`TODO`: Add any new methods if needed.
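For reference while working through the methods above: exact Shapley values can be computed by averaging each player's marginal contribution over all orderings. A stdlib-only sketch with a toy cooperative game (the game, player names, and numbers are purely illustrative):

```python
from itertools import permutations
from math import factorial

def shapley_values(players, value):
    """Exact Shapley values: average each player's marginal
    contribution over all orderings (O(n!), toy sizes only)."""
    phi = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = set()
        for p in order:
            before = value(frozenset(coalition))
            coalition.add(p)
            # marginal contribution of p under this ordering
            phi[p] += value(frozenset(coalition)) - before
    n_orders = factorial(len(players))
    return {p: total / n_orders for p, total in phi.items()}

# toy game: v({}) = 0, v({A}) = 1, v({B}) = 2, v({A, B}) = 4
v = {frozenset(): 0.0, frozenset("A"): 1.0,
     frozenset("B"): 2.0, frozenset("AB"): 4.0}
phi = shapley_values(["A", "B"], lambda s: v[s])
```

By efficiency, the attributions sum to the grand-coalition value: `phi["A"] + phi["B"] == v({A, B})`. Kernel SHAP and Tree SHAP are approximations/specializations of this quantity for model prediction games.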
-
https://github.com/slundberg/shap
-
Hi,
I have a tree explanation object containing ".data", ".values" and ".base_values". If I extract the Shapley values from this object for a single data row (a "local" value), I get the Shapley value…
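Assuming the usual array shapes a tree explainer produces (`.values` of shape `(n_samples, n_features)`, `.base_values` per sample), the "local" attribution for row `i` is just `values[i]`, and for tree models it should satisfy additivity: base value plus the per-feature attributions equals the model output. A numpy-only sketch with made-up numbers standing in for the explanation's arrays:

```python
import numpy as np

# stand-ins for a tree explanation's arrays (made-up numbers);
# in shap these would be explanation.values / .base_values / .data
values = np.array([[0.3, -0.1, 0.2],    # per-feature attributions, row 0
                   [0.0,  0.5, -0.2]])  # row 1
base_values = np.array([0.4, 0.4])      # expected model output per row
data = np.array([[1.0, 2.0, 3.0],
                 [4.0, 5.0, 6.0]])      # the raw feature values

i = 0                      # pick a single data row -> "local" explanation
local = values[i]          # shape (n_features,)
# additivity check: base value + sum of attributions = model output
prediction = base_values[i] + local.sum()
```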
-
Hey, great library!
Why don't you support additional SHAP explainers such as GPUTreeExplainer, DeepExplainer, GradientExplainer, or LinearExplainer?
-
Hi all,
I followed the code in test_trees.py, but I get an error with a loaded PySpark model, while the freshly trained PySpark model works fine. The error is:
~~~ bash
AssertionError: The backg…
-
### Issue Description
I used a random forest model with max_depth=50 and n_estimators=200. When I tried to use shap.TreeExplainer to get the SHAP values, my terminal reported this error. I have 3000 data…
-
We use SHAP explanations for models. It would be nice to have an explanation for each prediction.
-
### Motivation
The SHAP beeswarm plot looks like this:
![Beeswarm-plot-of-SHAP-calculation-for-the-ten-highest-ranking-variables-Variables-are](https://github.com/optuna/optuna/assets/79442793/c0448959-…
-
Due to [renaming of explainers in shap 0.43.0](https://github.com/shap/shap/commit/ca08229b8bb93ae05311532b7cc84f21ea8a8043), TimeSHAP logic fails when the default pip install is used:
```
-------…
```
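One defensive pattern against this class of breakage is to resolve the explainer class by trying a list of candidate attribute names, so downstream code survives renames in either direction. The candidate names below are illustrative, not the actual pre/post-0.43.0 names; check the shap changelog for those. A minimal stdlib-only sketch, demonstrated against a stand-in module object:

```python
from types import SimpleNamespace

def resolve_explainer(shap_module, candidates=("KernelExplainer", "Kernel")):
    """Return the first explainer class found under any candidate name.
    `candidates` is a hypothetical list -- consult the shap changelog
    for the names that actually changed in 0.43.0."""
    for name in candidates:
        cls = getattr(shap_module, name, None)
        if cls is not None:
            return cls
    raise ImportError(f"none of {candidates} found in {shap_module!r}")

# usage with a stand-in module object; real code would pass `import shap`
fake_shap = SimpleNamespace(Kernel=object)
explainer_cls = resolve_explainer(fake_shap)
```

Pinning `shap<0.43` in the install requirements is the other obvious stopgap until the calling code is updated.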
-
Hi,
Thank you for this great work. Do you have any plans to make it possible to explain RNN/LSTM models? Right now I have to generate the explanation with LIME and then plot it with force_plot.