onnx / onnxmltools

ONNXMLTools enables conversion of models to ONNX
https://onnx.ai
Apache License 2.0

[xgboost] Using `onnx==1.14.0` & `onnxruntime==1.15.1` with `target_opset=15` raises NOT_IMPLEMENTED #701

Open pvardanis opened 2 weeks ago

pvardanis commented 2 weeks ago

I've upgraded my library to use onnx==1.14.0 & onnxruntime==1.15.1 with onnxmltools==1.12.0. A model that previously exported and ran fine with onnx==1.12.0 & onnxruntime==1.13.0 now exports successfully but fails to load in an InferenceSession, raising the following:

>       sess.initialize_session(providers, provider_options, disabled_optimizers)
E       onnxruntime.capi.onnxruntime_pybind11_state.NotImplemented: [ONNXRuntimeError] : 9 : NOT_IMPLEMENTED : Could not find an implementation for Reshape(19) node with name 'reshape_node'

The model I'm exporting is a simple XGBRegressor:

from xgboost import XGBRegressor

X, y = diabetes_data  # test fixture providing the regression dataset
model = XGBRegressor()
model.fit(X, y)

converted with the same target_opset=15, simply because onnxconverter_common doesn't support anything higher.

Looking at the official onnx docs, onnxruntime==1.15 & onnx==1.14 support opsets up to 19, but exporting with anything above 15 isn't possible here. Could that be the cause of the error?

xadupre commented 2 weeks ago

Can you check the opset written in the onnx model? (with netron for example)