AccSrd opened this issue 3 months ago
I don't have an ONNX model to test with right now, but could you check whether saving the model first works? Something like:
import onnx
import onnxruntime

# Register the intermediate node names as additional graph outputs
for name in nodes_outputs:
    model.graph.output.extend([onnx.ValueInfoProto(name=name)])

# Save with external data and let InferenceSession load from the file path
output_onnx_path = "test.onnx"
onnx.save(model, output_onnx_path, save_as_external_data=True)
ort_session = onnxruntime.InferenceSession(output_onnx_path, providers=['CPUExecutionProvider'])
outputs = [x.name for x in ort_session.get_outputs()]
ort_outs = ort_session.run(outputs, ort_inputs)
Thanks for your kind reply! I'll check it soon and give you feedback :)
This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.
Describe the issue
The output of an intermediate node cannot be obtained from onnxruntime.InferenceSession for an ONNX model whose size exceeds 2 GB. Currently, when the onnxruntime Python package is used, to obtain the output of an intermediate node you must call model.graph.output.extend([onnx.ValueInfoProto(name=name)]) to register the node name after onnx.load(path), and then initialize the InferenceSession. However, initializing the InferenceSession from the loaded model object requires passing the result of model.SerializeToString(), which does not support ONNX models larger than 2 GB :(

To reproduce
Normally, we use the following code to obtain the outputs of intermediate nodes:
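(The original snippet is not preserved here; a minimal sketch of the usual in-memory approach, assuming model has already been loaded with onnx.load and that nodes_outputs and ort_inputs are defined, could look like the following.)

import onnx
import onnxruntime

# Register the intermediate node names as additional graph outputs
for name in nodes_outputs:
    model.graph.output.extend([onnx.ValueInfoProto(name=name)])

# Initialize the session from the serialized model bytes
ort_session = onnxruntime.InferenceSession(model.SerializeToString(),
                                           providers=['CPUExecutionProvider'])
outputs = [x.name for x in ort_session.get_outputs()]
ort_outs = ort_session.run(outputs, ort_inputs)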
However, if the ONNX model is larger than 2 GB, the SerializeToString() call will raise an error. Is there a possible solution to this awkward situation? Thank you very much.
Urgency
No response
Platform
Linux
OS Version
Ubuntu 9.4.0-1ubuntu1~20.04.2
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
1.18.1
ONNX Runtime API
Python
Architecture
X64
Execution Provider
Default CPU
Execution Provider Library Version
No response