Hi @yuanhs1996, my understanding is that you want to run inference in TensorRT? If so, I would recommend exporting the ONNX directly from the .pt
format and building the engine with the TensorRT APIs; either the Python or C++ API would work. Here is an example:
https://github.com/cyrusbehr/tensorrt-cpp-api?tab=readme-ov-file#sanity-check
Let me know if any questions :)
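In case it helps, here is a rough sketch of that .pt -> ONNX -> engine path using the TensorRT Python API. The file names, the input shape, and the assumption that the .pt file stores a full nn.Module (rather than just a state_dict) are all placeholders you would adapt to your model:

```python
# Sketch: export a PyTorch model to ONNX, then build a TensorRT engine from it.
# Assumes "model.pt" holds a complete nn.Module and a 1x3x640x640 input; adjust as needed.
import torch
import tensorrt as trt

# 1) Export the PyTorch (.pt) model to ONNX.
model = torch.load("model.pt", map_location="cpu")   # hypothetical checkpoint path
model.eval()
dummy_input = torch.randn(1, 3, 640, 640)            # assumed input shape
torch.onnx.export(model, dummy_input, "model.onnx", opset_version=12)

# 2) Parse the ONNX file and build a serialized engine with the Python API.
logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
)
parser = trt.OnnxParser(network, logger)
with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise RuntimeError("Failed to parse the ONNX model")

config = builder.create_builder_config()
config.set_memory_pool_limit(trt.MemoryPoolType.WORKSPACE, 1 << 30)  # 1 GiB workspace

serialized_engine = builder.build_serialized_network(network, config)
with open("model.engine", "wb") as f:
    f.write(serialized_engine)
```

The same build step can also be done from the command line with `trtexec --onnx=model.onnx --saveEngine=model.engine` if you prefer not to write any build code.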
Thank you for your answer; it was very helpful.
Hello, I would like to ask how to convert weights from TorchScript format to engine format. After the conversion, do I need to modify the program in order to run it?