Hi, I'm trying to convert an XGBClassifier to ONNX, and I noticed that if I use a large n_estimators together with the early_stopping_rounds argument, then convert the model to ONNX and load and run it, I get incorrect probabilities. I've attached an example below.
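Here is a minimal sketch of my setup (the original attachment isn't reproduced here, so the dataset, parameter values, and the "input" tensor name are stand-ins; it assumes xgboost >= 1.6, where early_stopping_rounds is a constructor argument):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier
from onnxmltools.convert import convert_xgboost
from onnxmltools.convert.common.data_types import FloatTensorType
import onnxruntime as rt

# Synthetic 6-class problem (a stand-in for the original data).
X, y = make_classification(n_samples=2000, n_features=20, n_informative=10,
                           n_classes=6, random_state=0)
X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=0)

# Large n_estimators so that early stopping actually triggers.
model = XGBClassifier(n_estimators=1000, early_stopping_rounds=10)
model.fit(X_train, y_train, eval_set=[(X_valid, y_valid)], verbose=False)

# Convert the fitted model to ONNX.
onnx_model = convert_xgboost(
    model, initial_types=[("input", FloatTensorType([None, X.shape[1]]))])

# Compare the ONNX probabilities against predict_proba on one sample.
sess = rt.InferenceSession(onnx_model.SerializeToString(),
                           providers=["CPUExecutionProvider"])
sample = X_valid[:1].astype(np.float32)
res = sess.run(None, {"input": sample})
print("Onnx:", res[1])
print("Original:", model.predict_proba(sample)[0].tolist())
```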
This returns:

```
Onnx: [0.00427639 0.5 0.5 0.5 0.5 0.5 ]  # doesn't even add up to 1!!
Original: [0.0010141434613615274, 0.0030910135246813297, 0.993025541305542, 0.00016845663776621222, 0.001386264804750681, 0.0013146025594323874]
```
I found that either removing early_stopping_rounds or making n_estimators smaller (small enough, I think, that the stopping condition never triggers) fixes the problem, and the ONNX model's probabilities then match the original model's.
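For reference, either of these configurations gave matching probabilities for me (a sketch under the same assumptions as the example above):

```python
# Workaround 1: drop early_stopping_rounds entirely.
model = XGBClassifier(n_estimators=1000)
model.fit(X_train, y_train)

# Workaround 2: keep early stopping but use an n_estimators small enough
# that the stopping condition never triggers.
model = XGBClassifier(n_estimators=20, early_stopping_rounds=10)
model.fit(X_train, y_train, eval_set=[(X_valid, y_valid)], verbose=False)
```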
Is this a bug, or am I missing something?