wangtianlong1994 opened 3 years ago
Can you try to run onnx simplifier in cli manually rather than run main.py?
Of course. I used the following command in the CLI to generate a new ONNX model. How can I serialize and save this model for C++ inference?
python -m onnxsim yolov5_1.onnx outputmodel.onnx
Simplifying...
Checking 0/3...
Checking 1/3...
Checking 2/3... Ok!
I don't know how to use the new ONNX model.
Thank you for your reply.
The outputmodel.onnx is equivalent to the output at https://github.com/TrojanXu/yolov5-tensorrt/blob/master/main.py#L368, so just feed it into the next step after line 368.
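For anyone following along: the step after line 368 in main.py is the regular TensorRT build (`build_engine`). Below is a minimal sketch of that step, assuming the TensorRT 7.x Python bindings; `build_engine_from_onnx` and `GiB` are hypothetical helper names, not from the repo:

```python
def GiB(n):
    # Convert gibibytes to bytes for the workspace-size setting.
    return n * (1 << 30)

def build_engine_from_onnx(onnx_path, use_half=False):
    """Parse a (simplified) ONNX file and build a TensorRT engine.

    Sketch only -- assumes the TensorRT 7.x Python bindings are installed.
    """
    import tensorrt as trt  # imported here so GiB stays importable without TensorRT

    logger = trt.Logger(trt.Logger.WARNING)
    explicit_batch = 1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
    with trt.Builder(logger) as builder, \
         builder.create_network(explicit_batch) as network, \
         trt.OnnxParser(network, logger) as parser:
        builder.max_workspace_size = GiB(1)
        if use_half:
            builder.fp16_mode = True
        with open(onnx_path, 'rb') as f:
            if not parser.parse(f.read()):
                # Print every parser error instead of the single generic message.
                for i in range(parser.num_errors):
                    print(parser.get_error(i))
                return None
        return builder.build_cuda_engine(network)
```

If parsing fails, the per-error output from `parser.get_error` usually names the exact node behind the generic UNSUPPORTED_NODE assertion.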
When I comment out `simplify_onnx(onnx_path)` and set `onnx_path = "outputmodel.onnx"`,
a new error is raised:
ERROR: Failed to parse the ONNX file. In node -1 (parseGraph): UNSUPPORTED_NODE: Assertion failed: convertOnnxWeights(initializer, &weights, ctx)
I found that TensorRT couldn't parse the ONNX model, even though Netron can open it.
# Load the ONNX model and parse it in order to populate the TensorRT network.
with open(onnx_path, 'rb') as model:
    if not parser.parse(model.read()):
        print('ERROR: Failed to parse the ONNX file.')
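Netron opening the file only shows that the protobuf deserializes; it does not mean every initializer is convertible by the TensorRT parser. A quick check with the `onnx` package (a sketch; `check_initializers` and `int64_names` are hypothetical helper names, and the INT64 code 7 comes from ONNX's TensorProto enum):

```python
INT64 = 7  # onnx.TensorProto.INT64

def int64_names(pairs):
    """Return the names whose dtype code is INT64.

    pairs: iterable of (initializer_name, data_type_code) tuples.
    """
    return [name for name, dtype in pairs if dtype == INT64]

def check_initializers(onnx_path):
    """Validate the graph and list INT64 initializers, which the
    TensorRT 7 ONNX parser may refuse to convert (convertOnnxWeights)."""
    import onnx  # imported lazily so int64_names stays importable without onnx
    model = onnx.load(onnx_path)
    onnx.checker.check_model(model)  # raises if the protobuf/graph is malformed
    return int64_names((i.name, i.data_type) for i in model.graph.initializer)
```

If `check_model` passes but `check_initializers` returns names, that narrows the failure down to weight conversion rather than graph structure.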
ERROR: Failed to parse the ONNX file. In node -1 (parseGraph): UNSUPPORTED_NODE: Assertion failed: convertOnnxWeights(initializer, &weights, ctx)
I have the same error.
Hello, I am experiencing the exact same error. I am using the correct versions (torch 1.4, onnx 1.6, TensorRT 7). I am seeing the Int64 warnings, which can be ignored for now, and after that I get:
ERROR: Failed to parse the ONNX file.
In node -1 (parseGraph): UNSUPPORTED_NODE: Assertion failed: convertOnnxWeights(initializer, &weights, ctx)
Traceback (most recent call last):
File "main.py", line 372, in <module>
trt_result = profile_trt(build_engine(onnx_path, using_half), batch_size, 10, 100)
File "main.py", line 275, in profile_trt
assert(engine is not None)
AssertionError
any ideas? Thanks!
Edit: I just read a comment in another issue, which gave me a thought:
Do I have to retrain with your new config with the custom Upsample? Is this the node that fails to parse? That would make sense, but it is not clear from the README at all. I will try retraining this way and report back. If that's the issue, the README should be updated to make it clear that we need to retrain!
Greetings, Maik
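One way to test the Upsample hypothesis without retraining is to look at what the export actually contains. A sketch using the `onnx` package (`upsample_report` and `count_ops` are hypothetical helpers; note that opset 11 exports `nn.Upsample` as a `Resize` node):

```python
from collections import Counter

def count_ops(op_types):
    # Tally operator names from an ONNX graph's node list.
    return Counter(op_types)

def upsample_report(onnx_path):
    """Report how many Upsample/Resize nodes the exported graph contains."""
    import onnx  # lazy import so count_ops stays importable without onnx
    model = onnx.load(onnx_path)
    counts = count_ops(node.op_type for node in model.graph.node)
    return {op: counts[op] for op in ("Upsample", "Resize") if counts[op]}
```

If the report shows `Upsample` nodes, the export predates the custom-Upsample change; if it only shows `Resize`, the failing node is likely something else.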
Hello again,
even after retraining from scratch with the custom Upsample, the same error occurs. Some feedback would be appreciated!
Thanks
@mfruhner Facing the same issue. Were you able to make any progress on this one?
Have you solved this problem? I now have the same problem. How did you solve it? Thanks!
(I tried with the yolov5s.pt in this repo.) Facing the same issue. What should I do? 😥
New version updated, tracking yolov5 v3.0 release.
With yolov5 v3.0 I still get: ERROR: Failed to parse the ONNX file. In node -1 (parseGraph): UNSUPPORTED_NODE: Assertion failed: convertOnnxWeights(initializer, &weights, ctx)
python main.py:
%960 : Float(1, 3, 80, 64, 81) = onnx::Slice(%918, %957, %958, %956, %959) # /home/zxzn/yolov5-tensorrt/yolo.py:42:0
%961 : Float(1, 3, 80, 64, 81) = onnx::Cast[to=1] # /home/zxzn/yolov5-tensorrt/yolo.py:42:0
%962 : Long() = onnx::Constant[value={-1}]()
%963 : Long() = onnx::Constant[value={81}]()
%964 : Tensor = onnx::Unsqueeze[axes=[0]]
%965 : Tensor = onnx::Unsqueeze[axes=[0]]
%966 : Tensor = onnx::Unsqueeze[axes=[0]]
%967 : Tensor = onnx::Concat[axis=0](%964, %965, %966)
%968 : Float(1, 15360, 81) = onnx::Reshape(%961, %967) # /home/zxzn/yolov5-tensorrt/yolo.py:42:0
%969 : Float(1, 15360, 85) = onnx::Concat[axis=-1](%955, %968) # /home/zxzn/yolov5-tensorrt/yolo.py:44:0
%prediction : Float(1, 20160, 85) = onnx::Concat[axis=1](%751, %860, %969) # /home/zxzn/yolov5-tensorrt/yolo.py:46:0
return (%prediction)

Segmentation fault (core dumped)
I use: python -m onnxsim yolov5_1.onnx outputmodel.onnx
Simplifying...
Checking 0/3...
Checking 1/3...
Checking 2/3... Ok!
A new ONNX model is generated. What should I do next? Should I rename the new model to yolov5_1.onnx and then comment out the following code?
# export_onnx(model, batch_size)
simplify_onnx(onnx_path)
A new error is reported after running:
ERROR: Failed to parse the ONNX file. In node -1 (parseGraph): UNSUPPORTED_NODE: Assertion failed: convertOnnxWeights(initializer, &weights, ctx)
What should I do? Thank you for your help.
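Instead of renaming files and editing main.py, the simplifier can also be run in-process so the simplified model is written wherever you want. A sketch, assuming onnx-simplifier's Python API (`simplify` returning a `(model, check)` pair, as in 0.2.x); `simplify_and_save` and `simplified_name` are hypothetical helpers:

```python
import os

def simplified_name(path):
    # Derive an output filename, e.g. "yolov5_1.onnx" -> "yolov5_1_simplified.onnx".
    stem, ext = os.path.splitext(path)
    return stem + "_simplified" + ext

def simplify_and_save(in_path, out_path=None):
    """Run onnx-simplifier programmatically and save the result."""
    import onnx
    from onnxsim import simplify  # lazy imports keep simplified_name testable

    model = onnx.load(in_path)
    model_simp, ok = simplify(model)
    if not ok:
        raise RuntimeError("simplified model failed the equivalence check")
    out_path = out_path or simplified_name(in_path)
    onnx.save(model_simp, out_path)
    return out_path
```

With this, `onnx_path` can simply be pointed at the returned path before the TensorRT build step, with no manual renaming.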
This is my environment (Package / Version):

appdirs 1.4.4
certifi 2020.6.20
cycler 0.10.0
decorator 4.4.2
graphsurgeon 0.4.1
kiwisolver 1.2.0
Mako 1.1.3
MarkupSafe 1.1.1
matplotlib 3.3.0
numpy 1.19.0
onnx 1.6.0
onnx-simplifier 0.2.9
onnxruntime 1.4.0
opencv-python 4.3.0.36
Pillow 7.2.0
pip 20.1.1
protobuf 3.12.2
pycuda 2019.1.2
pyparsing 2.4.7
python-dateutil 2.8.1
pytools 2020.3.1
PyYAML 5.3.1
scipy 1.5.1
setuptools 49.2.0.post20200714
six 1.15.0
tensorrt 7.0.0.11
torch 1.4.0
torchvision 0.5.0
tqdm 4.48.0
typing-extensions 3.7.4.2
uff 0.6.5
wheel 0.34.2