-
For debugging black-box models, it would be nice to get Shapley feature importance values as they relate to the loss of the model rather than the prediction. I've seen this implemented by the original …
-
Hi, I've been really enjoying the work you've done.
Is there a way I could use XAI methods with GRANDE, such as SHAP values, for interpretability?
I've been working with the kernel sha…
-
Hello, I have an issue with my SHAP values; here is my model:
```
Model: "model_3"
__________________________________________________________________________________________________
Layer (type) …
```
-
I've noticed that using pred_contribs to generate SHAP values takes significantly more GPU memory in XGBoost 2.1.1 vs 1.4.2.
This can lead to issues with generating SHAP values where no issu…
-
Opening this issue to collect and plan around the upcoming diagnostics package.
After #355 is merged, the last fitted surrogate model will be available. Since we deal with Bayesian models, our model…
-
Hi, I am using the fasttreeshap v0.1.6 package to calculate the SHAP values of my random forest model. It accelerates the SHAP value calculation substantially. However, when I tried to produce the s…
-
I am trying to use the GPUTree explainer; however, the explainer gives me an error. See my code below:
```
explainer = shap.explainers.GPUTree(rf_model)
shap_values = explainer(X_test)
```
The last…
-
When retrieving `full_shap` values, the debugger returns a matrix of shape `(number of training samples, number of features)`, e.g.
```
for index,i in enumerate(trial.tensor_names(regex='full_shap')…
```
-
### Issue Description
When using the bar plot function from shap, a TypeError occurs at line 260 when calling ax.set_yticks:
set_ticks does not accept a fontsize keyword.
### Minimal Reproducible Example
…
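This is not the original repro, but the underlying matplotlib constraint can be shown in isolation: `Axes.set_yticks` takes tick positions and labels, while font size belongs in `tick_params` (or `set_yticklabels`):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.barh(range(3), [0.3, 0.2, 0.1])

# set_yticks accepts positions and labels, but NOT a fontsize kwarg.
ax.set_yticks(range(3), labels=["f0", "f1", "f2"])
# Font size is set separately via tick_params (or set_yticklabels).
ax.tick_params(axis="y", labelsize=9)

sizes = [t.get_fontsize() for t in ax.get_yticklabels()]
print(sizes)
```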
-
### Issue Description
Hello,
I am learning how to use shap to explain **llama3.2**; however, my program reported an error. My environment works fine with the multi-class example in the docs w/ BERT,…