-
Hi @imoscovitz
Right now, I'm training IREP or RIPPER on up to 2 or 3k features.
In the end, the generated rules tend to use only 30 features at most.
In my case, doing such an amount of feature eng…
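As background for the point above (learned rules touching only a small fraction of the input features), here is a minimal sketch of counting which features a ruleset actually uses. The list-of-conditions representation below is hypothetical, purely for illustration; wittgenstein's real ruleset objects have their own structure.

```python
# Hypothetical rule representation: each rule is a list of
# (feature, operator, value) conditions; a ruleset is a list of rules.
ruleset = [
    [("age", ">=", 30), ("income", "<", 50000)],
    [("income", ">=", 80000)],
    [("age", "<", 18), ("student", "==", True)],
]

def features_used(ruleset):
    """Collect the distinct features referenced by any rule in the ruleset."""
    return sorted({feat for rule in ruleset for feat, _, _ in rule})

print(features_used(ruleset))  # ['age', 'income', 'student']
```

Even with thousands of candidate columns, a scan like this over the fitted rules shows how few survive into the final model.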
-
I am extending all chapters with a section on software implementations and alternative algorithms (also with software implementations). The software can be any free and open source software: R, Pyt…
-
Using:
`explainer = shap.TreeExplainer(xgb_model, test_yhat[X], feature_dependence='independent', model_output='logloss')`
I am able to calculate shap contributions to the logloss (or something…
-
Problem: `fstr_type = "FeatureImportance"` and `fstr_type = "ShapValues"` return identical results
catboost version: 0.9.1.1
Operating System: x86_64-redhat-linux-gnu (64-bit)
Dear Catboost te…
-
Follow the same outline as here, but for multinomial prediction: https://github.com/navdeep-G/interpretable-ml/blob/master/notebooks/credit/xgb_credit_binary_classifier.ipynb
* [x] Need multinomial XGBoost m…
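For the multinomial case referenced above, XGBoost's `multi:softprob` objective maps K raw margin scores per row through a softmax to get class probabilities. A dependency-free sketch of that step (the margin values are made up for illustration):

```python
import math

def softmax(margins):
    """Convert K raw class margins into probabilities that sum to 1."""
    m = max(margins)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in margins]
    total = sum(exps)
    return [e / total for e in exps]

# Made-up margins for a 3-class problem; class 0 gets the highest probability.
probs = softmax([1.2, 0.3, -0.5])
print(probs)
```

This is why per-class explanations are needed in the multinomial notebook: each class has its own margin, and attributions are usually computed on the margin scale before the softmax couples the classes together.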
-
I'm not entirely sure if this idea is within the scope of the project, but I thought I'd propose it anyway:
When there are multiple workshops or talks on offer, but only limited spots available for e…
-
* Since this theme is to be released soon, maybe after WordPress 4.7: from that version on, Custom CSS is part of core, so your option in `Shapley Options => Other => Custom CSS` needs to be removed.
-
Hello,
I wanted to get some opinions on using feature groups in tree SHAP (just as in Kernel SHAP) in order to group dummy features that belong together. I wonder whether this is actually useful, and if i…
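Because SHAP values are additive, a common pragmatic workaround for the dummy-feature question above is to sum the per-dummy attributions back into one value per original categorical feature, after explanation rather than inside the explainer. A minimal sketch with made-up attribution numbers; the column names and the `groups` mapping are hypothetical:

```python
# Made-up per-column SHAP values for one sample; the one-hot columns
# color_red / color_blue / color_green all came from a single feature "color".
shap_values = {
    "age": 0.12,
    "color_red": -0.05,
    "color_blue": 0.02,
    "color_green": 0.01,
}
groups = {"color": ["color_red", "color_blue", "color_green"]}

def group_shap(shap_values, groups):
    """Sum dummy-column attributions into one value per original feature."""
    grouped = dict(shap_values)
    for feature, cols in groups.items():
        grouped[feature] = sum(grouped.pop(c) for c in cols)
    return grouped

print(group_shap(shap_values, groups))
```

Summing preserves the property that all attributions still add up to the model output minus the base value; what it does not give you is a game-theoretically exact grouped Shapley value, which is presumably what the question is getting at.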
-
Are we interested in incorporating our own measures for local VIP (e.g., Shapley, LIME)?
-
I gave it a try on [Stack Overflow](https://stackoverflow.com/questions/51099368/passing-parameters-to-the-predict-function-in-mlr-for-xgboost), and the suggestion was to try here instead:
The latest…