nkinnaird opened this issue 7 months ago
I'm hesitating between a runtime issue and a conversion issue; most probably a conversion issue. Did you try with the Python runtime (https://onnx.ai/onnx/api/reference.html) to see if you get the same results?
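For example, a minimal check with the reference evaluator could look like the sketch below; the file name `model.onnx` and the input name `"X"` are placeholders for whatever your converted model actually uses:

```python
import numpy as np
from onnx.reference import ReferenceEvaluator

# Load the converted model into the pure-Python reference runtime.
sess = ReferenceEvaluator("model.onnx")

# One float32 sample shaped (1, n_features); substitute your own values.
x = np.array([[0.1, 0.2, 0.3, 0.4]], dtype=np.float32)

# A converted classifier typically has two outputs: labels and probabilities.
outputs = sess.run(None, {"X": x})
print(outputs)
```

If the reference evaluator and onnxruntime agree but scikit-learn differs, that points at the conversion rather than the runtime.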
Apologies - was on vacation and getting back to this now.
Predicting with ReferenceEvaluator(...).run gives me the same results, so I'm thinking it's probably a conversion issue. I'm going to try again and see whether changing the scikit-learn or skl2onnx versions resolves the problem; maybe I have a version misalignment somewhere throwing things off.
I have a scikit-learn DecisionTreeClassifier model which I've converted to ONNX and am running within Python. When I get the prediction probabilities from the scikit-learn model, they can take floating-point values between 0 and 1. When I get the prediction probabilities from the converted ONNX model, however, any non-zero probability is returned as 1.0. The predicted classes appear to be consistent between the two models, but the predicted probabilities are not. This seems very odd; I thought for a while that I had a data type issue, but I haven't been able to figure it out, so I'm posting here.
Relevant code is below:
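A minimal sketch of the pipeline in question, with a toy dataset standing in for the real one; the input name `"X"`, the tree depth, and the `zipmap` option are illustrative rather than my exact settings:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from skl2onnx import convert_sklearn
from skl2onnx.common.data_types import FloatTensorType
import onnxruntime as rt

# Train a decision tree (toy data standing in for the real training set).
X_train = np.random.rand(100, 4).astype(np.float32)
y_train = np.random.randint(0, 2, 100)
model = DecisionTreeClassifier(max_depth=4).fit(X_train, y_train)

# Convert to ONNX, declaring a float32 input named "X".
onnx_model = convert_sklearn(
    model,
    initial_types=[("X", FloatTensorType([None, X_train.shape[1]]))],
    options={id(model): {"zipmap": False}},  # return a plain probability matrix
)

# Run both models on the same samples and compare probabilities.
sess = rt.InferenceSession(
    onnx_model.SerializeToString(), providers=["CPUExecutionProvider"]
)
sample = X_train[:5]
skl_proba = model.predict_proba(sample)
onnx_label, onnx_proba = sess.run(None, {"X": sample})
print(skl_proba)
print(onnx_proba)
```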
Packages pip installed to run this include:
The model file types are unsupported for upload, otherwise I would attach them; I can provide them via Google Drive if someone wants to take a look.
I'm hoping there is something simple I'm missing, but I'm not sure whether the issue is with the decision tree model itself or with the way I've created the ONNX model.
Small edit: In the vast majority of cases the predicted probabilities from the decision tree are exactly 0 and 1. It is the rare cases where the prediction is a float strictly between 0 and 1 that the scikit-learn and ONNX models diverge, hence the hardcoded input vector I pasted above.
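For anyone reproducing this, a sketch of how I'm locating the diverging samples, assuming `model`, `sess`, and a float32 feature matrix `X_test` as in the snippet above:

```python
import numpy as np

# Probabilities from both models on the same float32 data.
skl_proba = model.predict_proba(X_test).astype(np.float32)
_, onnx_proba = sess.run(None, {"X": X_test})

# Rows where the two probability matrices disagree beyond float tolerance.
diff = np.abs(skl_proba - np.asarray(onnx_proba, dtype=np.float32))
bad_rows = np.where(diff.max(axis=1) > 1e-4)[0]
print(f"{len(bad_rows)} of {len(X_test)} samples diverge")
print(X_test[bad_rows[:5]])  # candidates for a hardcoded repro vector
```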