NVIDIA / TensorRT

NVIDIA® TensorRT™ is an SDK for high-performance deep learning inference on NVIDIA GPUs. This repository contains the open source components of TensorRT.
https://developer.nvidia.com/tensorrt
Apache License 2.0

onnx-to-tensorrt not working for custom ssd mobilenetv2 model #865

Closed 1208overlord closed 2 years ago

1208overlord commented 3 years ago

I have trained my own SSD MobileNetV2 model on custom data. I want to run it on a Xavier AGX, so I set out to convert it to TensorRT; the TensorFlow → ONNX → TensorRT path seemed like the right approach. I successfully generated the ONNX file, but the ONNX-to-TensorRT conversion keeps failing. Many other people have hit the same issue, but I couldn't find a solution that works for me. I tried the workarounds they suggested, but the same error persists. What should I do to fix it?

1208overlord commented 3 years ago

Additional information: I am using TensorRT 6.0.10 and trained the model with TensorFlow 1.15.
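For context, a common way to export a TF1 Object Detection API SSD model to ONNX is via tf2onnx. The sketch below is one plausible invocation, not necessarily what the reporter ran; the file name `frozen_inference_graph.pb` and the tensor names `image_tensor:0` / `detection_*:0` are assumptions (they are the defaults for TF1 OD API exports, but custom pipelines may differ):

```shell
# Sketch: export a TF1 frozen graph to ONNX with tf2onnx (names are assumptions)
python -m tf2onnx.convert \
    --graphdef frozen_inference_graph.pb \
    --inputs image_tensor:0 \
    --outputs detection_boxes:0,detection_scores:0,detection_classes:0,num_detections:0 \
    --opset 11 \
    --output model.onnx
```

Note that SSD post-processing ops (e.g. NonMaxSuppression) are a frequent failure point when the resulting ONNX is later parsed by TensorRT, especially on older TensorRT releases.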

rajeevsrao commented 3 years ago

@1208overlord Can you provide more specifics about the issue you are seeing (error log) when parsing the ONNX model via TensorRT or share the ONNX model?
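To capture the error log being asked for here, a minimal sketch of parsing the ONNX file with TensorRT's Python API and printing every parser error is below. This assumes a TensorRT version with the explicit-batch network flag (7.x/8.x; on 6.x the API differs slightly), and the file name `model.onnx` is a placeholder:

```python
# Sketch: parse an ONNX model with TensorRT and dump parser errors (assumes TRT 7+/8.x)
import tensorrt as trt

logger = trt.Logger(trt.Logger.VERBOSE)
builder = trt.Builder(logger)
# ONNX models require an explicit-batch network definition
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        # Print each recorded parser error; this is the log worth attaching
        for i in range(parser.num_errors):
            print(parser.get_error(i))
```

Attaching this output (or the `--verbose` log from trtexec) usually pinpoints the first unsupported node or shape mismatch.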

1208overlord commented 3 years ago

Here is the model I am trying to convert:

https://we.tl/t-ug58VPlMZq

Please help me with this conversion to TensorRT.

nvpohanh commented 2 years ago

@1208overlord Could you try TRT 8.2/8.4 and see if the issue still exists? If it does, we will debug it. Thanks
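A quick way to retest on a newer TensorRT release is trtexec, which both parses the ONNX model and builds an engine. This is a sketch; the binary path is where JetPack installs it on Jetson devices, and `model.onnx` is a placeholder:

```shell
# Sketch: try the conversion on TRT 8.x with trtexec and keep the verbose log
/usr/src/tensorrt/bin/trtexec \
    --onnx=model.onnx \
    --saveEngine=model.engine \
    --verbose 2>&1 | tee trtexec.log
```

If parsing still fails, the first error in `trtexec.log` identifies the offending ONNX node, which is the detail to include when reopening the issue.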

nvpohanh commented 2 years ago

Closing due to more than 14 days without activity. Please feel free to reopen if the issue still exists in TRT 8.4. Thanks