tensorflow / decision-forests

A collection of state-of-the-art algorithms for the training, serving and interpretation of Decision Forest models in Keras.
Apache License 2.0

is there any method for local interpretation? #180

Closed jasonmsetiadi closed 1 year ago

jasonmsetiadi commented 1 year ago

I am currently using SHAP explainers, such as TreeExplainer, to obtain local interpretations of gradient boosted models (XGBoost, LightGBM, etc.). I realize that tfdf offers methods for global interpretation, such as permutation feature importance, but what about local interpretation? I am unable to obtain local interpretations because tfdf models are not compatible with SHAP's TreeExplainer.
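To make the global/local distinction concrete: a local method attributes one example's prediction to its features. A minimal sketch of the idea, using Saabas-style path attributions on a single hand-built regression tree (each split contributes the change in node mean along the path the example takes). This is a toy illustration only, not TF-DF or SHAP API code; the `Node` class and `path_attributions` function are hypothetical names invented for this sketch.

```python
# Toy local attribution for one decision tree (Saabas-style):
# each split contributes (child mean - parent mean) along the
# path a single example follows. A simplified stand-in for the
# idea behind SHAP's TreeExplainer, not a real library API.

class Node:
    def __init__(self, value, feature=None, threshold=None, left=None, right=None):
        self.value = value          # mean target of training rows reaching this node
        self.feature = feature      # split feature index (None for a leaf)
        self.threshold = threshold
        self.left = left
        self.right = right

def path_attributions(root, x):
    """Return (bias, {feature: contribution}) for one example x."""
    contribs = {}
    node = root
    while node.feature is not None:
        child = node.left if x[node.feature] <= node.threshold else node.right
        # Contribution of this split = change in expected prediction.
        contribs[node.feature] = contribs.get(node.feature, 0.0) + (child.value - node.value)
        node = child
    return root.value, contribs

# A tiny hand-built tree: root mean 10.0, split on feature 0,
# then on feature 1 in the left branch.
tree = Node(10.0, feature=0, threshold=5.0,
            left=Node(6.0, feature=1, threshold=2.0,
                      left=Node(4.0), right=Node(8.0)),
            right=Node(14.0))

bias, contribs = path_attributions(tree, x=[3.0, 1.0])
# The prediction decomposes exactly: bias + sum(contributions) == leaf value.
print(bias, contribs, bias + sum(contribs.values()))
```

For an ensemble, per-feature contributions are summed over trees; SHAP's TreeExplainer refines this idea with Shapley-value-consistent attributions.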

rstz commented 1 year ago

Unfortunately, tfdf currently does not support local interpretation methods like SHAP out of the box. If you or anyone else wants to push forward an integration of TF-DF or YDF with SHAP's TreeExplainer, we'd be happy to provide assistance with design and implementation.