Closed manningkyle304 closed 1 year ago
You can compute them yourself with something similar to this:

```python
import shap
from xgboost_distribution import XGBDistribution

def get_shap_values(model, X):
    # Unwrap the underlying booster so shap's TreeExplainer can handle it
    if isinstance(model, XGBDistribution):
        model = model.get_booster()
    explainer = shap.TreeExplainer(model)
    shap_values = explainer(X)
    return shap_values
```
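Once you have the SHAP values, a common way to turn them into global feature importances is to take the mean absolute SHAP value per feature. A minimal numpy sketch (the array shape and numbers are purely illustrative, not output from an actual model):

```python
import numpy as np

# Illustrative SHAP values: rows are samples, columns are features.
shap_vals = np.array([
    [ 0.5, -0.2,  0.1],
    [-0.3,  0.4,  0.0],
    [ 0.2, -0.6,  0.1],
])

# Mean absolute SHAP value per feature is a standard global importance measure.
importance = np.abs(shap_vals).mean(axis=0)
print(importance)
```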
The xgboost-native way is to access the `feature_importances_` attribute after fitting; see the xgboost docs. Note that this averages over the parameters of a given distribution.
@CDonnerer Is there a way to get the specific feature importances for the separate parameters, though?
It does not appear to be natively supported, but you could potentially try using SHAP values (see also this issue).
In NGBoost, a nice feature is the ability to get feature importances for each distribution parameter separately; see the example in the NGBoost docs: https://stanfordmlgroup.github.io/ngboost/3-interpretation.html
However, this doesn't seem to be available in xgboost-distribution, or at least not via this method.
Is there a way to get feature importances? or any plan to add it?