linghu8812 / tensorrt_inference


Can you help deploy the Yolov7-based inference model in C++? #167

Closed Egorundel closed 1 year ago

Egorundel commented 1 year ago

Hi, I'm a beginner and just getting to know the world of TensorRT. How do I run TensorRT inference in C++ from an ONNX model, specifically for YOLOv7?

Could you please describe the specific steps for deploying the inference model in C++?

I have a custom ONNX model (YOLOv7) with my own weights and the corresponding .trt engine file.
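For anyone landing here with the same question: since the asker already has a serialized .trt engine, the C++ side boils down to deserializing it and running the bindings. Below is a minimal sketch using the TensorRT 7 C++ API (the version mentioned in this issue). The file name `yolov7.trt` and the FP32/two-binding layout are assumptions for illustration, not this repo's exact code; the engine itself can be built beforehand with `trtexec --onnx=yolov7.onnx --saveEngine=yolov7.trt`.

```cpp
#include <NvInfer.h>
#include <cuda_runtime_api.h>
#include <fstream>
#include <iterator>
#include <iostream>
#include <vector>

// Minimal logger required by the TensorRT runtime.
class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) override {
        if (severity <= Severity::kWARNING) std::cerr << msg << "\n";
    }
} gLogger;

int main() {
    // 1. Read the serialized engine from disk ("yolov7.trt" is an assumed name).
    std::ifstream file("yolov7.trt", std::ios::binary);
    std::vector<char> blob((std::istreambuf_iterator<char>(file)),
                           std::istreambuf_iterator<char>());

    // 2. Deserialize the engine and create an execution context.
    nvinfer1::IRuntime* runtime = nvinfer1::createInferRuntime(gLogger);
    nvinfer1::ICudaEngine* engine =
        runtime->deserializeCudaEngine(blob.data(), blob.size(), nullptr);
    nvinfer1::IExecutionContext* context = engine->createExecutionContext();

    // 3. Allocate one device buffer per binding (input image + YOLO output).
    std::vector<void*> buffers(engine->getNbBindings());
    for (int i = 0; i < engine->getNbBindings(); ++i) {
        nvinfer1::Dims d = engine->getBindingDimensions(i);
        size_t vol = 1;
        for (int j = 0; j < d.nbDims; ++j) vol *= static_cast<size_t>(d.d[j]);
        cudaMalloc(&buffers[i], vol * sizeof(float));  // assumes FP32 bindings
    }

    // 4. Copy the preprocessed image (resized, normalized, CHW float) to the
    //    input binding, run inference, and copy the raw detections back for
    //    confidence filtering and NMS on the host:
    // cudaMemcpy(buffers[0], hostInput, inputBytes, cudaMemcpyHostToDevice);
    context->executeV2(buffers.data());
    // cudaMemcpy(hostOutput, buffers[1], outputBytes, cudaMemcpyDeviceToHost);

    // 5. TensorRT 7 objects are released with destroy() rather than delete.
    for (void* b : buffers) cudaFree(b);
    context->destroy();
    engine->destroy();
    runtime->destroy();
    return 0;
}
```

Compile against the TensorRT and CUDA headers and link `nvinfer` and `cudart`. The preprocessing (letterbox resize to the network input size, BGR-to-RGB, scaling to [0,1]) and the postprocessing (decoding boxes, NMS) must match whatever the ONNX export of your YOLOv7 expects; those steps are model-specific and are left as comments here.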

I use Ubuntu 20.04, the CLion IDE, and TensorRT 7.2.2.3.

Thank you in advance for your help!

@linghu8812

Egorundel commented 1 year ago

I figured it out.