The error message I get is below. I have tried loading the pandas DataFrame as a NumPy array instead, and I have tried reverting to an earlier version of eli5, but nothing seems to fix this. Tearing my hair out a bit. Any ideas? Thanks kindly.
Here is the error:
TypeError: object of type 'numpy.int32' has no len()
```
TypeError                                 Traceback (most recent call last)
in
      4 # for i in range(len(features_pred)):
      5
----> 6 expl = eli5.explain_prediction_xgboost(model, features_pred.iloc[0,:])
      7
      8 #expl = eli5.explain_prediction_xgboost(booster, features_pred.iloc[0])

/databricks/python/lib/python3.5/site-packages/eli5/xgboost.py in explain_prediction_xgboost(xgb, doc, vec, top, top_targets, target_names, targets, feature_names, feature_re, feature_filter, vectorized, is_regression, missing)
    194
    195     scores_weights = _prediction_feature_weights(
--> 196         booster, dmatrix, n_targets, feature_names, xgb_feature_names)
    197
    198     x = get_X0(add_intercept(X))

/databricks/python/lib/python3.5/site-packages/eli5/xgboost.py in _prediction_feature_weights(booster, dmatrix, n_targets, feature_names, xgb_feature_names)
    244     xgb_feature_names = {f: i for i, f in enumerate(xgb_feature_names)}
    245     tree_dumps = booster.get_dump(with_stats=True)
--> 246     assert len(tree_dumps) == len(leaf_ids)
    247
    248     target_feature_weights = partial(

TypeError: object of type 'numpy.int32' has no len()
```
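For what it's worth, the assertion at the bottom of the traceback (`assert len(tree_dumps) == len(leaf_ids)`) fails because `len()` is being called on a NumPy integer scalar rather than on a sequence, i.e. somewhere along the way `leaf_ids` ends up as a single `numpy.int32` instead of an array. A minimal reproduction of just that `TypeError`, independent of eli5/xgboost:

```python
import numpy as np

# A 0-d NumPy integer behaves like a plain Python int: it has no length.
leaf_ids = np.int32(7)

try:
    len(leaf_ids)
except TypeError as e:
    print(e)  # object of type 'numpy.int32' has no len()

# Wrapping the scalar in a 1-d array restores a well-defined length.
print(len(np.atleast_1d(leaf_ids)))  # 1
```

So the question is presumably why the leaf-prediction output collapses to a scalar for this input, not anything about the model itself.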
Hi,
I'm trying to run the predictions method on a pandas DataFrame. I trained the model and have loaded new data to make predictions on.
My model is of type `xgboost.core.Booster`.
My statement is as follows:
`expl = eli5.explain_prediction_xgboost(model, features_pred.iloc[0,:])`
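One thing that may be worth checking (an assumption on my part, not a confirmed fix): `features_pred.iloc[0,:]` returns a 1-D pandas `Series`, whereas `features_pred.iloc[[0]]` keeps a 2-D one-row `DataFrame`. A quick sketch of the shape difference, using stand-in data:

```python
import pandas as pd

# Hypothetical stand-in for the real features_pred, for illustration only.
features_pred = pd.DataFrame({"f0": [1.0, 2.0], "f1": [3.0, 4.0]})

row_series = features_pred.iloc[0, :]  # 1-D Series
row_frame = features_pred.iloc[[0]]    # 2-D DataFrame with a single row

print(row_series.shape)  # (2,)
print(row_frame.shape)   # (1, 2)
```

If eli5 builds an XGBoost `DMatrix` from the document internally, the 2-D one-row form is the shape XGBoost's prediction methods normally receive, so trying `features_pred.iloc[[0]]` instead of `features_pred.iloc[0,:]` might at least narrow the problem down.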