onnx / onnx-tensorrt

ONNX-TensorRT: TensorRT backend for ONNX
Apache License 2.0

[5] Assertion failed: tensors.count(input_name) #540

Closed. MuhammadAsadJaved closed this issue 3 years ago.

MuhammadAsadJaved commented 3 years ago

Hi, I have converted a custom YOLOv3-TensorFlow model (it takes two inputs, performs feature fusion, and finally does object detection) from frozen_graph.pb to .onnx using tensorflow-onnx. Now I am trying to convert this .onnx to .trt using

onnx2trt modelIn/model.onnx -o modelOut/model.trt

and I got this error:


Input filename:   modelIn/model.onnx
ONNX IR version:  0.0.6
Opset version:    11
Producer name:    tf2onnx
Producer version: 1.6.3
Domain:
Model version:    0
Doc string:

Parsing model
While parsing node number 1 [Conv -> "lwir_darknet/conv0/batch_normalization/FusedBatchNormV3:0"]:
ERROR: /home/littro/onnx-tensorrt/ModelImporter.cpp:537 In function importModel:
[5] Assertion failed: tensors.count(input_name)

Note: I converted the .pb to .onnx on a Titan V, and now I am trying to convert the .onnx to .trt on a GTX 1080 Ti. Does the platform make any difference?
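
The assertion tensors.count(input_name) fires when a node refers to a tensor name that the importer has not seen as a graph input, an initializer, or the output of an earlier node. A minimal sketch for checking this with the onnx Python package (assuming it is installed, and using the model path from the command above; like the importer, it assumes the graph is topologically sorted):

```python
import onnx

model = onnx.load("modelIn/model.onnx")  # same file passed to onnx2trt above
graph = model.graph

# Tensors the importer can resolve so far: graph inputs, initializers,
# and outputs of nodes that appear earlier in the graph.
known = {i.name for i in graph.input}
known.update(init.name for init in graph.initializer)

for node in graph.node:
    for name in node.input:
        if name and name not in known:
            print(f"{node.op_type} node '{node.name}' uses unknown tensor '{name}'")
    known.update(node.output)
```

Any tensor name printed by this sketch is one the ONNX-TensorRT importer would also fail to resolve.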

MuhammadAsadJaved commented 3 years ago

Update: I have now also tried the original TensorFlow YOLOv3 model (.ckpt > .pb > .onnx), and converting that .onnx to .trt hits the same problem.

You can also see my .pb and .onnx models; I have uploaded them here:

https://drive.google.com/drive/folders/1uoCqNCMwNvrgW6TQ3Ox-3w_GM7Q8div5?usp=sharing
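
For reference, the .pb to .onnx step in the pipeline above is typically run with the tf2onnx command-line converter, roughly like this (the <input>/<output> tensor names are placeholders and must match the actual graph; both inputs would be listed since the model takes two):

python -m tf2onnx.convert --graphdef frozen_graph.pb --output model.onnx --inputs <input0>:0,<input1>:0 --outputs <output>:0 --opset 11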

kevinch-nv commented 3 years ago

@MuhammadAsadJaved it looks like you have a TRT engine generated and uploaded in that Google Drive link. In addition, I did not encounter any errors converting the provided ONNX model to TRT with TensorRT 7.2. Can this issue be closed?
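
For anyone who wants to reproduce this check against their locally installed TensorRT version, a minimal sketch using the TensorRT Python API (assuming the Python bindings are installed; this mirrors the parsing step that onnx2trt performs):

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

# Create an explicit-batch network and try to parse the ONNX file.
builder = trt.Builder(TRT_LOGGER)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, TRT_LOGGER)

with open("modelIn/model.onnx", "rb") as f:
    if parser.parse(f.read()):
        print("Parsed OK with TensorRT", trt.__version__)
    else:
        for i in range(parser.num_errors):
            print(parser.get_error(i))
```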

MuhammadAsadJaved commented 3 years ago

OK, thank you.