Closed Egorundel closed 1 year ago
Hi, I'm a beginner just getting to know the world of TensorRT. How do I run TensorRT inference in C++ from an ONNX model, specifically for YOLOv7?
Could you please describe the concrete steps for deploying the inference model in C++?
I have a custom ONNX model (YOLOv7) with my own weights and the corresponding .trt engine file.
I'm using Ubuntu 20.04, the CLion IDE, and TensorRT 7.2.2.3.
Thank you in advance for your help!
@linghu8812
I have figured it out.
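For anyone landing here with the same question, the usual TensorRT C++ deployment flow (deserialize the engine, create an execution context, allocate device buffers, enqueue inference, post-process) can be sketched as below. This is a minimal sketch, not the repository's actual code: the file name `yolov7.trt`, the 1x3x640x640 input shape, and the output buffer size are placeholder assumptions you must adapt to your own model, and error checking is omitted for brevity. The API calls shown (`createInferRuntime`, `deserializeCudaEngine`, `enqueueV2`) match the TensorRT 7.x C++ API mentioned in the question.

```cpp
#include <NvInfer.h>
#include <cuda_runtime_api.h>
#include <fstream>
#include <iostream>
#include <iterator>
#include <vector>

// Minimal logger required by the TensorRT runtime.
class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) noexcept override {
        if (severity <= Severity::kWARNING) std::cout << msg << std::endl;
    }
};

int main() {
    // 1. Read the serialized engine, built beforehand,
    //    e.g. with: trtexec --onnx=yolov7.onnx --saveEngine=yolov7.trt
    std::ifstream file("yolov7.trt", std::ios::binary);  // placeholder path
    std::vector<char> engineData((std::istreambuf_iterator<char>(file)),
                                 std::istreambuf_iterator<char>());

    // 2. Deserialize the engine and create an execution context.
    Logger logger;
    nvinfer1::IRuntime* runtime = nvinfer1::createInferRuntime(logger);
    nvinfer1::ICudaEngine* engine = runtime->deserializeCudaEngine(
        engineData.data(), engineData.size(), nullptr);
    nvinfer1::IExecutionContext* context = engine->createExecutionContext();

    // 3. Allocate device buffers for each binding (input image, output tensor).
    //    Sizes assume a 1x3x640x640 float input; the output size depends on
    //    your YOLOv7 head and is a placeholder here.
    void* buffers[2];
    size_t inputSize  = 1 * 3 * 640 * 640 * sizeof(float);
    size_t outputSize = 1000 * sizeof(float);  // placeholder, model-dependent
    cudaMalloc(&buffers[0], inputSize);
    cudaMalloc(&buffers[1], outputSize);

    // 4. Copy the preprocessed input to the GPU, run inference asynchronously,
    //    and copy the raw detections back to the host.
    std::vector<float> input(1 * 3 * 640 * 640);   // filled by your preprocessing
    std::vector<float> output(1000);
    cudaStream_t stream;
    cudaStreamCreate(&stream);
    cudaMemcpyAsync(buffers[0], input.data(), inputSize,
                    cudaMemcpyHostToDevice, stream);
    context->enqueueV2(buffers, stream, nullptr);
    cudaMemcpyAsync(output.data(), buffers[1], outputSize,
                    cudaMemcpyDeviceToHost, stream);
    cudaStreamSynchronize(stream);

    // 5. Post-process `output` (decode boxes, apply NMS), then release resources.
    cudaFree(buffers[0]);
    cudaFree(buffers[1]);
    cudaStreamDestroy(stream);
    context->destroy();  // destroy() is the TensorRT 7.x teardown idiom
    engine->destroy();
    runtime->destroy();
    return 0;
}
```

Preprocessing (letterbox resize, BGR-to-RGB, normalization) and NMS are model-specific and left out above; they fill `input` before step 4 and consume `output` after it.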