-
This is related to #580, where people experienced a reshape error when passing a matrix with more or fewer features to `explainer.shap_values` than the number of features used to train a tree model.…
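A minimal sketch of the shape contract behind that reshape error, using only scikit-learn (no shap), under the assumption that the explainer simply requires the explained matrix to have the same column count the tree model was trained on:

```python
# Sketch of the feature-count contract: the matrix passed for explanation
# must have exactly as many columns as the model was trained on.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X_train = rng.normal(size=(100, 4))        # model trained on 4 features
y_train = (X_train[:, 0] > 0).astype(int)

model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)

X_ok = rng.normal(size=(10, 4))            # 4 columns: matches training
assert X_ok.shape[1] == model.n_features_in_

X_bad = rng.normal(size=(10, 5))           # 5 columns: this mismatch is what
assert X_bad.shape[1] != model.n_features_in_   # surfaces as the reshape error
```

Checking `X.shape[1]` against `model.n_features_in_` before calling the explainer turns the opaque reshape error into an explicit, actionable check.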
-
Hi,
Is there an API to get SHAP values for a model?
Or is there any support for running SHAP in AWS Lambda?
It would be really helpful for me.
Please help.
-
When I run the **SHAP with AutoGluon-Tabular Census income classification.ipynb** in the [existing solution of SHAP for AutoGluon](https://github.com/awslabs/autogluon/tree/master/examples/tabular/int…
-
When I ran the example code from the link below (using shap version 0.37.0):
https://github.com/slundberg/shap/blob/master/notebooks/text_examples/sentiment_analysis/text_to_multiclass_explanations/Emo…
-
### Issue Description
The following code produces `UserWarning: unrecognized nn.Module: Flatten`:
### Minimal Reproducible Example
```python
import torch
from torch import nn, Tensor
from shap im…
-
Hi,
I was looking at SHAP interaction values and this paper:
[Consistent Individualized Feature Attribution for Tree Ensembles](https://arxiv.org/pdf/1802.03888.pdf)
I was hoping someone co…
-
There is a recent paper explaining how to do explain_prediction for trees and tree ensembles, which the authors claim is better than treeinterpreter-like measures: https://arxiv.org/pdf/1706.06060.pdf…
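For context, the treeinterpreter-like measure mentioned above can be sketched in pure scikit-learn (a simplified illustration for a single regression tree, not the paper's method): walk the decision path and attribute each change in node mean to the feature that was split on.

```python
# Treeinterpreter-style decomposition: prediction = bias + sum(contributions).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 3))
y = 3.0 * X[:, 0] + X[:, 1]

tree = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X, y)
t = tree.tree_

def path_contributions(x):
    """Attribute each step's change in node mean to the split feature."""
    node = 0
    bias = t.value[0, 0, 0]                 # mean target at the root
    contribs = np.zeros(X.shape[1])
    while t.children_left[node] != -1:      # walk down until a leaf
        feat = t.feature[node]
        nxt = (t.children_left[node] if x[feat] <= t.threshold[node]
               else t.children_right[node])
        contribs[feat] += t.value[nxt, 0, 0] - t.value[node, 0, 0]
        node = nxt
    return bias, contribs

bias, contribs = path_contributions(X[0])
assert np.isclose(bias + contribs.sum(), tree.predict(X[:1])[0], atol=1e-8)
```

The paper's criticism of this style of measure is that such path-order attributions are inconsistent across equivalent trees, which is the gap SHAP-style values aim to close.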
-
@kmike @lopuhin Hi eli5 community,
I'm interested in contributing to the eli5 project, specifically the task of adding SHAP support.
How should I start?
-
Enhancing the training set with additional features does not guarantee better
performance. Some features might not carry any valuable information and thus
contribute only noise. In future work, we …
-
### Issue Description
This is a follow-up to #3187, where I reported inconsistencies in the returned SHAP explainer object depending on the explanation task (e.g. classification bin/multi or r…