Recent advances in interpretability have produced SHAP values for assigning score contributions to features in a consistent manner. Calculating per-feature score contributions was added to XGBoost and LightGBM. It would be great to have it in gbm3 as well:
https://github.com/slundberg/shap