maaft opened this issue 6 days ago
Could you try https://github.com/microsoft/onnxruntime/blob/yifanl/debug_sym_shape_infer/onnxruntime/python/tools/symbolic_shape_infer.py and see how it works?
Hi @yf711
Now it works for the supplied model.
But for another one, I get this:
```
Traceback (most recent call last):
  File "symbolic_shape_infer.py", line 3041, in <module>
    out_mp = SymbolicShapeInference.infer_shapes(
  File "symbolic_shape_infer.py", line 2973, in infer_shapes
    all_shapes_inferred = symbolic_shape_inference._infer_impl()
  File "symbolic_shape_infer.py", line 2737, in _infer_impl
    self.dispatcher_[node.op_type](node)
  File "symbolic_shape_infer.py", line 2054, in _infer_SplitToSequence
    self._infer_Split_Common(node, helper.make_sequence_value_info)
AttributeError: module 'onnx.helper' has no attribute 'make_sequence_value_info'
```
How about this https://github.com/microsoft/onnxruntime/blob/yifanl/rel-1.19.2-symshapeinfer-reducemean/onnxruntime/python/tools/symbolic_shape_infer.py, which is based on your 1.19.2 version? The previous one was based on the latest main.
Describe the issue
According to the TensorRT EP docs, one should run symbolic shape inference before executing a model with the TensorRT backend.
For this purpose, symbolic_shape_infer.py is linked in the article.
I found that this script fails for certain models, and I don't know why yet.
Error:
To reproduce
```
python symbolic_shape_infer.py --input model.onnx --output model.onnx
```
Urgency
Since this is a necessary step to use the TensorRT backend, I feel this could be urgent for some people.
Platform
Linux
OS Version
Ubuntu 22.04
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
1.19.2
ONNX Runtime API
Python
Architecture
X64
Execution Provider
TensorRT
Execution Provider Library Version
10.5