isarsoft / yolov4-triton-tensorrt

This repository deploys YOLOv4 as an optimized TensorRT engine to Triton Inference Server
http://www.isarsoft.com

C++ client infer script needed [feature] #1

Closed sudapure closed 3 years ago

sudapure commented 4 years ago

Most of the code in the repo covers building the network and serializing the plan file, but without a client inference script it looks incomplete. It would be interesting to have a client inference script that consumes Triton Server as a shared library instead of calling it over gRPC or HTTP.

philipp-schmidt commented 3 years ago

@sudapure It isn't C++ (and frankly I'm not sure I'll have time to code a C++ client, but the code would be VERY similar, and all the pre- and post-processing C++ code can be copied from the tensorrtx repo), but check out the working Python client example under clients/python.
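For readers landing here, the shape of such a Python gRPC client can be sketched as below. This is a minimal sketch, not the repo's actual client: the model name `yolov4`, the input/output tensor names `data` and `prob`, and the 608×608 input size are assumptions based on common tensorrtx conventions — check the repo's `clients/python` and the model config for the real names. The preprocessing here uses a numpy-only nearest-neighbor letterbox to stay dependency-light; the real client would typically use OpenCV.

```python
import numpy as np


def preprocess(image, input_size=608):
    """Letterbox-style resize (nearest neighbor, numpy only) to a square
    input_size x input_size tensor, normalized to [0, 1], CHW, float32,
    with a leading batch dimension."""
    h, w = image.shape[:2]
    scale = input_size / max(h, w)
    new_h, new_w = int(round(h * scale)), int(round(w * scale))
    # Nearest-neighbor resize via index maps (avoids an OpenCV dependency)
    rows = (np.arange(new_h) / scale).astype(np.int64).clip(0, h - 1)
    cols = (np.arange(new_w) / scale).astype(np.int64).clip(0, w - 1)
    resized = image[rows][:, cols]
    # Pad to a square gray canvas, centering the image (common YOLO letterbox)
    canvas = np.full((input_size, input_size, 3), 128, dtype=np.uint8)
    top = (input_size - new_h) // 2
    left = (input_size - new_w) // 2
    canvas[top:top + new_h, left:left + new_w] = resized
    # HWC uint8 -> CHW float32 in [0, 1], plus batch dimension
    chw = canvas.transpose(2, 0, 1).astype(np.float32) / 255.0
    return chw[np.newaxis, ...]


if __name__ == "__main__":
    # Requires `pip install tritonclient[grpc]` and a running Triton server.
    # Tensor names "data"/"prob" and model name "yolov4" are assumptions.
    import tritonclient.grpc as grpcclient

    client = grpcclient.InferenceServerClient(url="localhost:8001")
    batch = preprocess(np.zeros((480, 640, 3), dtype=np.uint8))
    inp = grpcclient.InferInput("data", batch.shape, "FP32")
    inp.set_data_from_numpy(batch)
    out = grpcclient.InferRequestedOutput("prob")
    result = client.infer(model_name="yolov4", inputs=[inp], outputs=[out])
    print(result.as_numpy("prob").shape)
```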

philipp-schmidt commented 3 years ago

If you definitely need a shared lib, you can start from here and develop the very same application in C++, using the Python client as a reference for the steps. The OpenCV calls are practically identical function calls, and the pre- and post-processing can be copied from the tensorrtx implementation.

sudapure commented 3 years ago

@philipp-schmidt, thanks for the commit. I will go through the official TensorRT docs for shared library support. However, would it be possible to reuse the pre-processing and post-processing methods here for a YOLOv5 implementation in TensorRT?

philipp-schmidt commented 3 years ago

Hi, sorry, I don't understand the question. You want to use the pre- and post-processing for YOLOv5?

sudapure commented 3 years ago

@philipp-schmidt, yes. Would it be possible to use the pre- and post-processing for YOLOv5?

philipp-schmidt commented 3 years ago

I don't know what the output format of YOLOv5 is; you'll have to look that up. YOLOv4 performs as well as (if not better than) YOLOv5.
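Whether the post-processing transfers mostly depends on the raw output layout, but the final step — non-max suppression over scored boxes — is the same for YOLOv4 and YOLOv5. A generic sketch of that shared step (not the repo's actual code, which lives in clients/python):

```python
import numpy as np


def iou(box, boxes):
    """IoU of one [x1, y1, x2, y2] box against an (N, 4) array of boxes."""
    x1 = np.maximum(box[0], boxes[:, 0])
    y1 = np.maximum(box[1], boxes[:, 1])
    x2 = np.minimum(box[2], boxes[:, 2])
    y2 = np.minimum(box[3], boxes[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area_a = (box[2] - box[0]) * (box[3] - box[1])
    area_b = (boxes[:, 2] - boxes[:, 0]) * (boxes[:, 3] - boxes[:, 1])
    return inter / (area_a + area_b - inter + 1e-9)


def nms(boxes, scores, iou_threshold=0.45):
    """Greedy non-max suppression: keep the highest-scoring box, drop all
    boxes overlapping it above the threshold, repeat. Returns kept indices."""
    order = np.argsort(scores)[::-1]  # indices sorted by descending score
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        rest = order[1:]
        if rest.size == 0:
            break
        overlaps = iou(boxes[i], boxes[rest])
        order = rest[overlaps <= iou_threshold]
    return keep
```

The decoding step before this (mapping raw network output to `[x1, y1, x2, y2]` boxes and confidence scores) is the model-specific part you would need to adapt per architecture.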

sudapure commented 3 years ago

ok, thanks. i will look into it!