isarsoft / yolov4-triton-tensorrt

This repository deploys YOLOv4 as an optimized TensorRT engine to Triton Inference Server
http://www.isarsoft.com
276 stars · 63 forks

C++ program #41

Closed YLongJin closed 2 years ago

YLongJin commented 3 years ago

I have read the official documentation, but I am still not clear on Triton. I want to know whether it is possible, in my own C++ TensorRT program on a local GPU, to deploy a .engine file for inference. I just want to implement data parallelism, like your program does. Any help would be appreciated.

philipp-schmidt commented 3 years ago

You can have a look at how to do this in a different repo, which comes with C++ code for it:

https://github.com/wang-xinyu/tensorrtx/tree/master/yolov4

olibartfast commented 3 years ago

Hi, if you're interested, I implemented a basic Triton C++ client example in my repo, using the inference code from the wang-xinyu repo and the .plan model generated with this repo's code.
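For anyone reproducing this setup: Triton serves a TensorRT .plan through a model repository. A minimal sketch of the layout and `config.pbtxt` follows — the model name is a placeholder, and input/output sections are omitted because for `tensorrt_plan` models Triton can derive them from the plan itself:

```
models/
└── yolov4/
    ├── config.pbtxt
    └── 1/
        └── model.plan
```

```
name: "yolov4"
platform: "tensorrt_plan"
max_batch_size: 1
# Two model instances on one GPU let Triton execute requests in parallel,
# which is the data-parallelism behavior asked about earlier in this thread.
instance_group [
  {
    count: 2
    kind: KIND_GPU
    gpus: [ 0 ]
  }
]
```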

philipp-schmidt commented 3 years ago

Hi @olibartfast I never got the time to implement the c++ client even though Triton has very good documentation and examples. Would you mind PR'ing your code into this repo? There is room under the folder clients/c++. I will of course add you to a list of contributors in the README.md if you want. Would be very much appreciated. Cheers.

philipp-schmidt commented 2 years ago

C++ client example now available for v1.3.0 under clients/c++