-
https://www.kaggle.com/code/subinium/interpretable-cnn-with-shap-mnist/notebook
-
### Issue Description
AssertionError: The SHAP explanations do not sum up to the model's output! This is either because of a rounding error or because an operator in your computation graph was not fu…
-
Hi there,
is there an established way to obtain SHAP Feature Importance using shapper?
Reading this https://christophm.github.io/interpretable-ml-book/shap.html#shap-feature-importance
...I w…
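The chapter linked above defines SHAP feature importance as the mean of the absolute SHAP values per feature. A minimal numpy sketch of that definition, with made-up values standing in for a real explainer's output:

```python
import numpy as np

# Stand-in SHAP value matrix: rows = instances, columns = features.
shap_values = np.array([[ 0.5, -0.2,  0.1],
                        [-0.3,  0.4,  0.0],
                        [ 0.2, -0.1,  0.3]])

# SHAP feature importance as defined in the linked chapter:
# the mean absolute SHAP value per feature.
importance = np.abs(shap_values).mean(axis=0)
```

Sorting features by this vector gives the bar-plot ordering the chapter describes.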
-
```python
import statsmodels.api as sm
from scipy.stats import f_oneway

def fit_spline_to_shap(feature_values, shap_values, knots):
    if knots > 0:
        # Spline fitting
        knot_positions =…
```
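The snippet above is cut off, but the general technique (least-squares regression on a truncated-power spline basis) can be sketched self-containedly. The helper name and knot handling here are hypothetical, and plain numpy is used instead of statsmodels:

```python
import numpy as np

def fit_linear_spline(x, y, knots):
    """Least-squares fit of a piecewise-linear spline via a
    truncated-power basis: [1, x, max(0, x - k) for each knot k]."""
    X = np.column_stack([np.ones_like(x), x] +
                        [np.maximum(0.0, x - k) for k in knots])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef, X @ coef

# Synthetic feature/SHAP pairs with a kink at x = 5.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 10.0, 200))
y = np.where(x < 5.0, x, 10.0 - x) + rng.normal(0.0, 0.1, 200)

coef, fitted = fit_linear_spline(x, y, knots=[5.0])
```

With a knot placed at the true kink, `coef[2]` estimates the change in slope there (about -2 for this synthetic curve).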
-
**Problem Description**
Currently, it is not possible to store/retrieve the SHAP Values for individual features before they are eliminated to give a reduced feature set. This limits the analysis of S…
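One way to keep those values around is to snapshot the attribution matrix each round, before a feature is dropped. A minimal sketch with random numbers standing in for real SHAP values; the elimination rule (dropping the feature with the lowest mean |value|) is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
feature_names = ["f0", "f1", "f2", "f3"]
shap_like = rng.normal(size=(100, 4))  # stand-in for real SHAP values

history = {}  # round -> (surviving feature names, their SHAP values)
names = list(feature_names)
values = shap_like.copy()
for rnd in range(3):
    # Snapshot BEFORE elimination so the dropped feature's values survive.
    history[rnd] = (list(names), values.copy())
    weakest = int(np.argmin(np.abs(values).mean(axis=0)))
    names.pop(weakest)
    values = np.delete(values, weakest, axis=1)
```

Each `history` entry can then be inspected or plotted after the run, even for features eliminated early.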
-
Hey @slundberg,
I'm trying to get SHAP added to a conda-forge recipe, and all dependent packages must be on conda as well. The [issue is with iml](https://circleci.com/gh/conda-forge/staged-recipe…
-
The current CCAO feature evaluation process for new model features is very ad-hoc. We typically look at the change in model performance metrics before and after the addition of a new feature, as well …
-
I've been trying to use pickle to save the explainer; how would I save the KernelExplainer?
```python
samples = shap.sample(x_test, 100)
training = shap.sample(x_train, 10000)
explainer = shap.KernelEx…
```
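Since the excerpt is truncated, here is only a generic round-trip sketch with a hypothetical stand-in class; whether a real `KernelExplainer` pickles this way depends on whether the wrapped prediction function (and anything it closes over) is itself picklable:

```python
import pickle

class StandInExplainer:
    """Hypothetical stand-in for a fitted explainer object."""
    def __init__(self, predict_fn, background):
        self.predict_fn = predict_fn  # must itself be picklable
        self.background = background

explainer = StandInExplainer(predict_fn=None, background=[0.0, 1.0])

# Round-trip through pickle; pickle.dump/load to a file works the same way.
blob = pickle.dumps(explainer)
restored = pickle.loads(blob)
```

If the model function does not pickle (for example, it wraps a GPU session), a common fallback is to save the computed SHAP values instead and rebuild the explainer from the model and background sample when needed.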
-
### Issue Description
When finding shap values for the deep learning model using Tensorflow, there is some operation lookup error for following operations:
1. TensorListStack
2. BatchMatMulV2 - Tra…
-
### Issue Description
`DeepExplainer` currently seems unable to handle a Pytorch model loaded from TorchScript, and will throw `RuntimeError: register_forward_hook is not supported on ScriptModules`.…