-
I dumped the `adult_dataset` that you mention in the README into a CSV and ran a [RandomForestClassifier](https://scikit-learn.org/stable/modules/generated/sklearn.ensemble.RandomForestClassifier.html) wi…
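A minimal sketch of the workflow described above. The column names and target here are synthetic stand-ins, not the reporter's actual dump of `adult_dataset`:

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the CSV dumped from adult_dataset,
# so this snippet runs on its own.
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(500, 4)), columns=list("abcd"))
df["income"] = (df["a"] + df["b"] > 0).astype(int)  # hypothetical target

X_train, X_test, y_train, y_test = train_test_split(
    df.drop(columns="income"), df["income"], random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
score = clf.score(X_test, y_test)
```

In practice the first step would be `pd.read_csv(...)` on the dumped file instead of the synthetic frame.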
-
Hi there,
Thank you for developing this wonderful package.
There is a method of estimating variable importance implemented by [LeBreton and Tonidandel (2014)](https://link.springer.com/article…
-
Hi,
I have just started learning the XGBoost model. I used SHAP as a tool for feature selection in my XGBoost prediction model. After obtaining the feature importances, I noticed that the SHAP values o…
-
Add the script for the Gale-Shapley Stable Matching Algorithm.
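A sketch of what such a script could look like: the classic deferred-acceptance formulation, with hypothetical preference dictionaries (proposer and acceptor names are illustrative):

```python
def gale_shapley(proposer_prefs, acceptor_prefs):
    """Gale-Shapley deferred acceptance.

    proposer_prefs / acceptor_prefs map each participant to their
    preference list (most preferred first). Returns a stable matching
    as a dict {proposer: acceptor}.
    """
    # rank[a][p] = position of proposer p in acceptor a's list (lower = preferred)
    rank = {a: {p: i for i, p in enumerate(prefs)}
            for a, prefs in acceptor_prefs.items()}
    free = list(proposer_prefs)                   # proposers not yet matched
    next_choice = {p: 0 for p in proposer_prefs}  # next acceptor to propose to
    engaged = {}                                  # acceptor -> proposer

    while free:
        p = free.pop()
        a = proposer_prefs[p][next_choice[p]]
        next_choice[p] += 1
        if a not in engaged:
            engaged[a] = p                        # acceptor was free: accept
        elif rank[a][p] < rank[a][engaged[a]]:
            free.append(engaged[a])               # a prefers p: dump current partner
            engaged[a] = p
        else:
            free.append(p)                        # rejected: p proposes again later
    return {p: a for a, p in engaged.items()}


# Example with hypothetical participants:
men = {"A": ["X", "Y"], "B": ["Y", "X"]}
women = {"X": ["B", "A"], "Y": ["A", "B"]}
matching = gale_shapley(men, women)  # {"A": "X", "B": "Y"}
```

The resulting matching is stable (no man and woman both prefer each other over their assigned partners) and proposer-optimal, the standard guarantees of the algorithm.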
-
Hi, as far as I understand, Datascope is compatible with any scikit-learn pipeline. I'm using PyTorch and skorch (a library that wraps PyTorch) to make my classifier scikit-learn compatible.
I'm curr…
-
`UnivariateFinite` can be overkill for, e.g., simple `Bool` categories.
It looks like these distributions could be at least 50x faster to construct using a custom struct without the `LittleDict`.
Adding an `Abst…
-
**Features**
- [x] Input feature list (hopefully informed by @navdeep-G using GBM Shapley)
**For simulated data**
- [x] Unconstrained feedforward ANN trained w/ 5-fold CV with training/CV and tes…
-
## ❓ Questions and Help
Hello, I have just set up Captum to analyze a ResNet-50 model (taken directly from torchvision). Captum works as expected for most of the attribution methods…
-
Hi! I fit a multiclass classification XGBoost model and tried to generate a plot of Shapley values using `shap.prep` and then `shap.plot.summary`. However, I'm getting the following error when using shap.…
-
TL;DR: I couldn't get DeepExplainer to show the correlation between input and output when using a softmax; the plots are below, and the code is [here](https://github.com/ydib/shap_softmax_problem/blob/main/s…