-
We keep having serialization issues (missing `Serializable`) with classes in h2o-genmodel. The basic use cases are covered; however, more advanced ones are often missed (e.g. Shapley contributions using …
-
### Issue Description
I am trying to install shap with CUDA support on a Windows 10 Enterprise (64-bit) machine. I cloned the shap repository at version 0.45.1 and built SHAP using `python -m pip install…
-
Hi all,
I trained a LightGBM model and used the `TreeExplainer` function to calculate the Shapley scores, but I noticed that the expected value comes from `explainer_shap = shap.TreeExplainer(mo…
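For context, the usual pattern with LightGBM and `TreeExplainer` looks roughly like the sketch below (the synthetic data and model settings are placeholders, not the original poster's setup): `expected_value` is the baseline, i.e. the model's average raw output over the background data, and each row's SHAP values sum to the difference between that row's raw prediction and the baseline.

```python
import lightgbm as lgb
import numpy as np
import shap

# Synthetic regression data as a stand-in for the poster's dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = 2 * X[:, 0] + X[:, 1] + rng.normal(scale=0.1, size=500)

model = lgb.LGBMRegressor(n_estimators=100).fit(X, y)

explainer_shap = shap.TreeExplainer(model)
shap_values = explainer_shap.shap_values(X)

# expected_value is the baseline (average raw model output over the
# background data); each row's SHAP values sum to raw prediction - baseline.
print(explainer_shap.expected_value)
print(np.allclose(shap_values.sum(axis=1) + explainer_shap.expected_value,
                  model.predict(X), atol=1e-4))
```

Note that for LightGBM classifiers the baseline lives in the raw (log-odds) space rather than in probability space, which is a common source of confusion about `expected_value`.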
-
This might be a trivial question, but I am asking this for my own clarity. In Equation 6.5, the variable-importance measure associated with the breakDown analysis is given as $v(j,\underline{x}_*)$. I…
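For reference, assuming the standard break-down construction (a paraphrase rather than a verbatim copy of Equation 6.5), $v(j,\underline{x}_*)$ is the change in the conditional expected prediction when the $j$-th variable is additionally fixed at its value in $\underline{x}_*$, given that the first $j-1$ variables are already fixed:

$$
v(j,\underline{x}_*) \;=\; E\!\left[f(\underline{X}) \,\middle|\, \underline{X}^{\{1,\ldots,j\}} = \underline{x}_*^{\{1,\ldots,j\}}\right] \;-\; E\!\left[f(\underline{X}) \,\middle|\, \underline{X}^{\{1,\ldots,j-1\}} = \underline{x}_*^{\{1,\ldots,j-1\}}\right]
$$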
-
1. Shapley value: https://en.wikipedia.org/wiki/Shapley_value
   A solution for defining each member's contribution to the group (see the sketch after this list).
   https://www.youtube.com/watch?v=MHS-htjGgSY
2. If a member of a coalition contributes …
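As a concrete illustration of the definition linked in item 1, here is a small self-contained Python sketch that computes exact Shapley values for a made-up two-player coalition game (the game `v` and the player names are illustrative only):

```python
from itertools import permutations

# Toy characteristic function: each player alone is worth 1, together 3.
def v(coalition):
    values = {frozenset(): 0, frozenset({"A"}): 1,
              frozenset({"B"}): 1, frozenset({"A", "B"}): 3}
    return values[frozenset(coalition)]

def shapley_values(players, v):
    """Exact Shapley values: average each player's marginal contribution
    over all orderings of the players."""
    phi = {p: 0.0 for p in players}
    orderings = list(permutations(players))
    for order in orderings:
        seen = []
        for p in order:
            phi[p] += v(seen + [p]) - v(seen)   # marginal contribution of p
            seen.append(p)
    return {p: total / len(orderings) for p, total in phi.items()}

print(shapley_values(["A", "B"], v))  # {'A': 1.5, 'B': 1.5}
```

Averaging over all orderings is what makes the split fair: the synergy of the full coalition is credited to the players symmetrically.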
-
Hi! Thanks for open-sourcing this, but I'm a little confused about your code.
https://github.com/Euphoria16/Shapley-NAS/blob/06497276e819a38af997415a5015900aa3e93667/train_search.py#L252
Why `minus…
-
This issue relates to the printed book and the electronic version dated 2020-02-01.
Section 16.8.1, Algorithm 7: https://bradleyboehmke.github.io/HOML/iml.html#shapley-values
```
| b1 = x, but all the featur…
```
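For readers who want to sanity-check the algorithm, below is a hedged Python sketch of the Štrumbelj–Kononenko-style Monte Carlo approximation that Algorithm 7 describes, with `b1`/`b2` named to mirror the algorithm's hybrid instances; the function name and arguments are illustrative and not taken from the book:

```python
import numpy as np

def approx_shapley(f, x, X_background, j, n_iter=1000, seed=None):
    """Monte Carlo estimate of feature j's Shapley contribution for instance x.

    f            : prediction function taking a 2-D array
    x            : the instance being explained (1-D array)
    X_background : array of reference instances to draw z from
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    n_features = x.shape[0]
    total = 0.0
    for _ in range(n_iter):
        z = X_background[rng.integers(len(X_background))]  # random reference row
        order = rng.permutation(n_features)                # random feature ordering
        pos = int(np.where(order == j)[0][0])
        keep = np.isin(np.arange(n_features), order[:pos + 1])
        b1 = np.where(keep, x, z)   # x for j and the features "before" j, z otherwise
        b2 = b1.copy()
        b2[j] = z[j]                # identical to b1, except j comes from z
        total += f(b1.reshape(1, -1))[0] - f(b2.reshape(1, -1))[0]
    return total / n_iter
```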
-
Hi,
I was looking at SHAP interaction values, and this paper:
[Consistent Individualized Feature Attribution for Tree Ensembles](https://arxiv.org/pdf/1802.03888.pdf)
I was hoping someone co…
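For context, the pairwise interaction values from that paper are exposed by shap's `TreeExplainer.shap_interaction_values`; a minimal sketch (the synthetic data and XGBoost model are placeholders for the poster's setup):

```python
import numpy as np
import shap
import xgboost

# Synthetic data with an interaction between features 0 and 1.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = X[:, 0] * X[:, 1] + X[:, 2] + rng.normal(scale=0.1, size=500)

model = xgboost.XGBRegressor(n_estimators=200).fit(X, y)

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)                     # shape (n, p)
interaction_values = explainer.shap_interaction_values(X)  # shape (n, p, p)

# Main effects sit on the diagonal; off-diagonal entries split each pairwise
# interaction symmetrically. Summing the matrix recovers the row's SHAP total.
print(np.allclose(interaction_values.sum(axis=(1, 2)),
                  shap_values.sum(axis=1), atol=1e-4))
```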
-
Test SubgraphX example:
```python
explainer = SubgraphX(grace, num_classes=4, device=device,
                      explain_graph=False, reward_method='nc_mc_l_shapley')
```
then I get this error:
TypeError …
-
Hi there, thanks for your great work. When I run `1_surrogate_sanity_check.ipynb` and reach the 5th cell:
```python
def generate_mask(num_players: int, num_mask_samples: int or None = None…
```
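For readers unfamiliar with the notebook, a mask generator in this kind of surrogate setup typically samples binary coalition vectors over the players. The following is a hypothetical, self-contained sketch under that assumption; it is not the repository's actual implementation, and the cardinality-sampling scheme is a guess:

```python
from typing import Optional
import numpy as np

def generate_mask(num_players: int, num_mask_samples: Optional[int] = None) -> np.ndarray:
    """Hypothetical sketch: sample binary coalition masks of shape
    (num_mask_samples, num_players). Each row draws a coalition size
    uniformly at random, then includes that many randomly chosen players."""
    num_mask_samples = 1 if num_mask_samples is None else num_mask_samples
    rng = np.random.default_rng()
    masks = np.zeros((num_mask_samples, num_players), dtype=np.float32)
    for i in range(num_mask_samples):
        size = rng.integers(0, num_players + 1)                  # coalition size
        members = rng.choice(num_players, size=size, replace=False)
        masks[i, members] = 1.0                                  # included players
    return masks

print(generate_mask(num_players=5, num_mask_samples=3))
```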