Summary
Scott Lundberg has added SHAP interaction values to his TreeSHAP implementation in XGBoost, but not in LightGBM. This option is very helpful for opening the "black box". I'd love to see SHAP interaction values in LightGBM as well.
There are a couple of TreeSHAP experts around, such as @hbaniecki, who might be willing to implement it directly in LightGBM.
Description
SHAP interaction values decompose a raw prediction into the sum of contributions from all feature pairs, with main effects on the diagonal. There is a C++ implementation in XGBoost (by Scott, I think), and the R package {treeshap} (https://github.com/ModelOriented/treeshap) contains another C++ implementation that also works for LightGBM.
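To make the decomposition concrete, here is a brute-force sketch of the definition from the Lundberg et al. paper below. It enumerates all feature subsets, so it is exponential in the number of features; the point of the C++ implementations (and of this request) is the polynomial-time TreeSHAP version of the same quantity. The toy `value` function and feature labels are illustrative assumptions:

```python
# Brute-force SHAP interaction values from the subset-enumeration definition.
# `value(S)` is the model evaluated with the features in set S switched to the
# explained instance and all others at the background.
from itertools import combinations
from math import factorial

def shap_values(value, features):
    """Classic Shapley values phi_i for a set function `value`."""
    M = len(features)
    phi = {}
    for i in features:
        rest = [f for f in features if f != i]
        total = 0.0
        for r in range(M):
            for S in combinations(rest, r):
                S = set(S)
                w = factorial(len(S)) * factorial(M - len(S) - 1) / factorial(M)
                total += w * (value(S | {i}) - value(S))
        phi[i] = total
    return phi

def shap_interaction_values(value, features):
    """Pairwise matrix Phi[i, j]; off-diagonal cells split each interaction
    in half, diagonals hold the remaining main effects."""
    M = len(features)
    phi = shap_values(value, features)
    Phi = {}
    for i, j in combinations(features, 2):
        rest = [f for f in features if f not in (i, j)]
        total = 0.0
        for r in range(M - 1):
            for S in combinations(rest, r):
                S = set(S)
                w = (factorial(len(S)) * factorial(M - len(S) - 2)
                     / (2 * factorial(M - 1)))
                total += w * (value(S | {i, j}) - value(S | {i})
                              - value(S | {j}) + value(S))
        Phi[(i, j)] = Phi[(j, i)] = total
    for i in features:
        Phi[(i, i)] = phi[i] - sum(Phi[(i, j)] for j in features if j != i)
    return Phi

# Toy model with an explicit interaction: f(x1, x2) = x1 + 2*x2 + x1*x2,
# explained at x = (1, 1) against a background of (0, 0).
def value(S):
    x1 = 1 if 1 in S else 0
    x2 = 1 if 2 in S else 0
    return x1 + 2 * x2 + x1 * x2

Phi = shap_interaction_values(value, [1, 2])
# Main effects land on the diagonal (Phi[(1,1)] == 1.0, Phi[(2,2)] == 2.0),
# the x1*x2 interaction (size 1) splits evenly across the off-diagonal cells
# (0.5 each), and the whole matrix sums to f(1, 1) - f(0, 0) = 4.
```

The consistency property worth preserving in any LightGBM implementation: summing the matrix recovers the raw prediction minus the expected value, and summing a row recovers that feature's ordinary SHAP value.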
References
Lundberg, S.M., Erion, G., Chen, H. et al. From local explanations to global understanding with explainable AI for trees. Nat Mach Intell 2, 56–67 (2020). https://doi.org/10.1038/s42256-019-0138-9