isarsoft / yolov4-triton-tensorrt

This repository deploys YOLOv4 as an optimized TensorRT engine to Triton Inference Server
http://www.isarsoft.com

How to deploy a YOLOv5 model with Triton Inference Server #30

Closed Amanda-Barbara closed 3 years ago

Amanda-Barbara commented 3 years ago

Can you share a tutorial on serving a model trained with https://github.com/ultralytics/yolov5 on Triton Inference Server? Thanks.

philipp-schmidt commented 3 years ago

It's on the roadmap, but I don't think it will happen anytime soon. However, there are multiple YOLOv5 TensorRT implementations that are structured identically to the YOLOv4 implementation in this repo, so you can deploy them exactly as described in this repo's README by just replacing yolov4: https://github.com/wang-xinyu/tensorrtx/tree/master/yolov5
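
For reference, the Triton side of the deployment stays the same: serialize the TensorRT engine and place it in the model repository just like the README does for yolov4, only under a different model name. A sketch of the layout, assuming the usual Triton model repository convention (the names `yolov5` and `model.plan` here are illustrative, not fixed by either repo):

```
models/
└── yolov5/
    ├── 1/
    │   └── model.plan    # serialized TensorRT engine
    └── config.pbtxt      # optional for tensorrt_plan models; Triton can
                          # often derive the config from the plan file
```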

Amanda-Barbara commented 3 years ago

@philipp-schmidt I will try it, thanks very much.

huynhbaobk commented 2 years ago

I have already tried YOLOv5 with Triton: https://github.com/huynhbaobk/tensorrt-triton-yolov5

tienluongngoc commented 2 years ago

@Amanda-Barbara you can try this: https://github.com/tienluongngoc/yolov5_triton_inference_server

danudeep90 commented 1 year ago

@tienluongngoc, @huynhbaobk: Can you share the config.pbtxt file that you used for your model? I am using the one below but am facing an issue in the postprocess function.

name: "yolov5m_torch" platform: "pytorch_libtorch" max_batch_size : 0 input [ { name: "INPUT0" data_type: TYPE_FP32 dims: [1, 3, 640, 640] } ] output [ { name: "OUTPUT0" data_type: TYPE_FP32 dims: [1, 25200 , 85] } ] default_model_filename: "yolov5m.torchscript"