isarsoft / yolov4-triton-tensorrt

This repository deploys YOLOv4 as an optimized TensorRT engine to Triton Inference Server
http://www.isarsoft.com

How to deploy multiple customized yolov4 models on the triton server #42

Closed Amanda-Barbara closed 3 years ago

Amanda-Barbara commented 3 years ago

@philipp-schmidt I can deploy a customized YOLOv4 model with the plugin on the Triton server. Now I want to deploy multiple customized YOLOv4 models with different CLASS_NUM values on the Triton server. How should I do it? Thanks.

philipp-schmidt commented 3 years ago

I believe you should be able to do just that. Compile the code with the settings for your first model and create the engine, then change CLASS_NUM, compile again, and build your second engine.

The plugin was changed recently to support multiple models even if they differ in their settings. You no longer need multiple builds of the yolo layer plugin when the parameters change.
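Once both engines are built, deploying them side by side comes down to Triton's model repository layout: one directory per model, each with its own serialized engine and configuration. A minimal sketch (the model names, paths, tensor names, and class counts below are hypothetical; the actual input/output names and dimensions must match what your engine builder emits):

```
model_repository/
├── yolov4_cars/              # hypothetical model name
│   ├── 1/
│   │   └── model.plan        # engine built with the first CLASS_NUM
│   └── config.pbtxt
└── yolov4_persons/           # hypothetical model name
    ├── 1/
    │   └── model.plan        # engine built with the second CLASS_NUM
    └── config.pbtxt
```

Each `config.pbtxt` would then declare `platform: "tensorrt_plan"` and that model's own input/output shapes, since the detection output size depends on the class count. Clients select a model per request by name, so both can be served from one Triton instance.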

Amanda-Barbara commented 3 years ago

@philipp-schmidt Yes, I have seen your new version; you have done it. Thanks!