-
The points below are the things we should complete before the final project is due. This doesn't include the work we should all contribute to, such as expanding the report and creating the presentat…
-
Outcomes
- Compare open-source versus managed, cloud-native solutions
- Compare multiple patterns for decoupling backend services
- Compare servers vs containers vs serverless computing
- Assess trad…
-
Hi all, I am working on a binary classification problem, using xgboost.
Once I have trained my model on X_train, I run the following line so that the SHAP explainer is "trained" as well:
`tree_shap_explainer …
-
Hi, I am trying to interpret a Cox survival model with SHAP, so I want to mimic what was done for the nhanesi dataset in the official docs.
![image](https://user-images.githubus…
-
- Could you please comment on feature importance. What did you learn from them? Were they all needed? How do you establish feature importance? How is the subsequent work affected by this?
- Have you …
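One simple, model-agnostic way to establish whether a feature was needed is drop-column importance: refit without each feature and measure the change in cross-validated score. A sketch with assumed synthetic data and a logistic-regression model:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=5, n_informative=2, random_state=0)

# baseline cross-validated accuracy with all features
base = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()

# drop each column in turn; a large score drop means the feature mattered
for j in range(X.shape[1]):
    X_drop = np.delete(X, j, axis=1)
    score = cross_val_score(LogisticRegression(max_iter=1000), X_drop, y, cv=5).mean()
    print(f"feature {j}: importance = {base - score:.3f}")
```

This is slower than built-in importances but answers "were they all needed?" directly, since it measures actual predictive loss.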
-
It would be nice to have some integrated tools for model inspection and feature importance (FI). Below are some links to resources and what's available in scikit-learn.
Scikit-learn exposes a numbe…
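As a concrete example of what scikit-learn already provides: tree ensembles expose impurity-based `feature_importances_`, and `sklearn.inspection.permutation_importance` gives a model-agnostic alternative. A minimal sketch on synthetic data:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

X, y = make_classification(n_samples=300, n_features=6, n_informative=3, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# impurity-based importances (built in; can be biased toward
# high-cardinality or continuous features)
print(model.feature_importances_)

# permutation importances: score drop when each feature is shuffled,
# works for any fitted estimator
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
print(result.importances_mean)
```

Permutation importance is generally the safer default for model inspection, at the cost of extra compute.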
-
Hi,
When I ran **shap_arr = explainer.shap_values(test_X)**, I got the error message:
RuntimeError: The size of tensor a (6998) must match the size of tensor b (3499) at non-singleton dimension 1…
-
**3. Project proposal: reasoning**
Comments
What sort of EDA will you do? What types of plots? Why? Any hypotheses?
What about class balance? There could be an imbalance in the classes, in which …
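Checking class balance is a one-liner worth doing early; if the classes are skewed, a stratified split keeps the proportions stable across train and test. A sketch with a hypothetical 9:1 label vector (the real target column would come from the project data):

```python
import numpy as np
from collections import Counter
from sklearn.model_selection import train_test_split

# hypothetical labels: 90 negatives, 10 positives
y = np.array([0] * 90 + [1] * 10)

counts = Counter(y)
for cls, n in counts.items():
    print(f"class {cls}: {n} samples ({n / len(y):.0%})")

# stratify=y preserves the 9:1 ratio in both splits
X = np.arange(len(y)).reshape(-1, 1)  # placeholder features
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, stratify=y, test_size=0.2, random_state=0
)
print(y_te.mean())  # positive rate in the test split
```

If the imbalance is severe, options include `class_weight="balanced"` on most sklearn classifiers, resampling, or evaluation metrics that are robust to imbalance (PR-AUC rather than accuracy).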
-
Hi, I'm currently trying to use spvim() from vimpy for its ability to accommodate arbitrary prediction functions, as opposed to sp_vim() in R, where as far as I can see only learners from the SL library ca…
-
After a cuML random forest is fitted, one can use the PermutationExplainer to get Shapley values:
https://docs.rapids.ai/api/cuml/stable/api.html#cuml.explainer.PermutationExplainer
As:
```
from cuml.ex…