Open pvardanis opened 1 week ago
Try the following:

```python
model_def = onnx.helper.make_model(
    graph_def,
    producer_name="reshape_node",
    opset_imports=[helper.make_opsetid("", 18)],
)
```

ref: https://onnx.ai/onnx/_modules/onnx/helper.html#make_model
@pvardanis Are you using the CPU backend or the TensorRT backend in onnxruntime?
@lix19937 I'm using `onnxmltools.convert.convert_xgboost` to convert the XGBoost model to ONNX, then modifying the graph with `onnx-graphsurgeon`. What is your proposed method supposed to do on top of that?
@yuanyao-nv I'm using the default installation, I guess it's the CPU backend?
onnxruntime 1.15.1 doesn't support onnx 1.14 (opset 18). Use `opset_imports` to pin the model to an opset that onnxruntime 1.15.1 supports; that should give you the same behavior you see with onnxruntime 1.16.
I'm using `onnxruntime==1.15.1` & `onnx==1.14.0` with `onnx-graphsurgeon==0.5.2`. I'm modifying the output of an ONNX graph of an XGBoost model using the `Reshape` & `Graph` operators as follows, respectively:

The model exports successfully, but fails to load using:

and raises the following errors:

Weird thing is, everything works fine with `onnxruntime==1.16.1`. Unfortunately, I'm restricted to using `1.15.1` and need to find a workaround for this.