linkedin / FastTreeSHAP

Fast SHAP value computation for interpreting tree-based models
BSD 2-Clause "Simplified" License

Additivity check fails with XGBoost #31

Open ThomasBury opened 11 months ago

ThomasBury commented 11 months ago

Hi,

When using XGBoost, the additivity check fails, while it does not when using native shap or LightGBM. Using xgboost 1.7.6, shap 0.41.0 and fasttreeshap 0.1.6.

Could it be linked to https://github.com/linkedin/FastTreeSHAP/issues/15?

Example (changing algorithm or feature_perturbation leads to the same error):

from sklearn.datasets import make_regression
from xgboost import XGBRegressor
from lightgbm import LGBMRegressor
from fasttreeshap import TreeExplainer as FastTreeExplainer

# synthetic regression data
X, y = make_regression(n_samples=1000, n_features=10, n_informative=8, noise=1, random_state=8)
model = XGBRegressor()  # LGBMRegressor() does not trigger the error
model.fit(X, y)
explainer = FastTreeExplainer(model, algorithm="auto", shortcut=False, feature_perturbation="tree_path_dependent")
shap_matrix = explainer.shap_values(X)
# Raises: Exception: Additivity check failed in TreeExplainer! Please ensure [...]

The following runs fine:

import shap

explainer = shap.TreeExplainer(model, feature_perturbation="tree_path_dependent")
shap_values = explainer.shap_values(X)
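For context, here is a minimal sketch of the property the additivity check verifies: per-row SHAP values plus the base value should reconstruct the model output within a tolerance. This is my own illustration with a fabricated matrix, not the library's internal check; the function name and tolerances are assumptions.

```python
import numpy as np

def additivity_check(shap_matrix, expected_value, predictions, rtol=1e-3, atol=1e-3):
    # Additivity: sum of per-feature SHAP values + base value == model output, row by row.
    reconstructed = shap_matrix.sum(axis=1) + expected_value
    return np.allclose(reconstructed, predictions, rtol=rtol, atol=atol)

# Toy data: a fabricated SHAP matrix whose rows sum to the "predictions" by construction.
rng = np.random.default_rng(0)
shap_matrix = rng.normal(size=(5, 3))
expected_value = 2.0
predictions = shap_matrix.sum(axis=1) + expected_value

print(additivity_check(shap_matrix, expected_value, predictions))  # True
```

When XGBoost fails this check but native shap passes, it suggests FastTreeSHAP reconstructs a different base value or tree output for the XGBoost model dump than what the model actually predicts.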

Thanks