Open Columbine21 opened 4 years ago
Same error here; the problem lies in:
    def _prediction_feature_weights(booster, dmatrix, n_targets,
                                    feature_names, xgb_feature_names):
        """ For each target, return score and numpy array with feature weights
        on this prediction, following an idea from
        http://blog.datadive.net/interpreting-random-forests/
        """
        # XGBClassifier does not have pred_leaf argument, so use booster
        leaf_ids, = booster.predict(dmatrix, pred_leaf=True)
When the model contains only one tree, booster.predict(dmatrix, pred_leaf=True) returns a 1-d array, so the tuple unpacking leaf_ids, = ... no longer yields an array of per-tree leaf ids (it unpacks a scalar), and the explainer fails downstream.
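The shape problem can be reproduced with plain numpy, without xgboost; this is a minimal sketch in which the arrays stand in for simulated pred_leaf outputs (the leaf-id values are made up):

```python
import numpy as np

# With several trees, pred_leaf output for a single-row DMatrix is 2-d:
# shape (1, n_trees), so unpacking yields the row of leaf ids.
multi_tree = np.array([[4, 7, 2]])
leaf_ids, = multi_tree
assert leaf_ids.shape == (3,)  # one leaf id per tree

# With a single tree, xgboost squeezes the output to 1-d: shape (1,),
# so the same unpacking yields a 0-d scalar instead of an array.
single_tree = np.array([5])
leaf_ids, = single_tree
assert np.ndim(leaf_ids) == 0  # scalar, not an array of leaf ids
```

Any later code that iterates over leaf_ids or indexes into it then breaks, which matches the reported failure.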
This could be fixed by modifying the last line:
    def _prediction_feature_weights(booster, dmatrix, n_targets,
                                    feature_names, xgb_feature_names):
        """ For each target, return score and numpy array with feature weights
        on this prediction, following an idea from
        http://blog.datadive.net/interpreting-random-forests/
        """
        # XGBClassifier does not have pred_leaf argument, so use booster
        leaf_ids, = booster.predict(dmatrix, pred_leaf=True).reshape(1, -1)
Now the question is: does dmatrix always contain only one sample here?
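The reshape(1, -1) fix is only safe if dmatrix always holds exactly one row (a 1-d single-tree output for several samples would be silently reshaped into the wrong thing). A sketch of a helper that makes that single-sample assumption explicit; the function name and the leaf-id values are hypothetical, and the arrays again simulate pred_leaf outputs:

```python
import numpy as np

def _leaf_ids_for_single_sample(pred_leaf_output):
    """Normalize pred_leaf output to a 1-d array of leaf ids, one per tree.

    Assumes the DMatrix held exactly one sample, so the raw output is
    either shape (1, n_trees) or, for a single-tree model, a squeezed
    shape (1,) array.
    """
    arr = np.asarray(pred_leaf_output)
    if arr.ndim == 1:
        # single tree: the squeezed array holds one leaf id per sample
        if arr.shape[0] != 1:
            raise ValueError("expected exactly one sample, got %d" % arr.shape[0])
        return arr  # shape (1,): one leaf id for the single tree
    leaf_ids, = arr  # shape (1, n_trees) -> (n_trees,)
    return leaf_ids

# simulated outputs (hypothetical leaf ids)
assert _leaf_ids_for_single_sample(np.array([[4, 7, 2]])).tolist() == [4, 7, 2]
assert _leaf_ids_for_single_sample(np.array([5])).tolist() == [5]
```

Raising on a multi-sample 1-d input, rather than reshaping it, surfaces the ambiguity instead of producing wrong leaf ids.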
Description: the XGBClassifier explainer fails when n_estimators=1 (an extreme case). E.g. with

    model_t = XGBClassifier(random_state=1111, max_depth=4, n_estimators=1)

show_prediction(model, test_input[tougue_correct_q[0]]) fails to run.
The error message is as follows: