NVIDIA / gpu-rest-engine

A REST API for Caffe using Docker and Go
BSD 3-Clause "New" or "Revised" License

TensorRT 6 for detection + classification #41

Open mkh-github opened 5 years ago

mkh-github commented 5 years ago

Using gpu-rest-engine, I created a service that can detect, classify, or detect and then classify. The first step is detection with SSD using a single class (other than background); then crops are taken from the bounding boxes and run through a GoogLeNet classifier.

Would I be able to do the same with TRT inference server? or would I need to use https://github.com/NVIDIA/tensorrt-laboratory?

Thanks! gpu-rest-engine has worked very well, but I want to upgrade to TensorRT 6 and am running into issues.

flx42 commented 5 years ago

Hello @mkh-github. You should ask them directly on their GitHub. TRTIS is open source too: https://github.com/NVIDIA/tensorrt-inference-server

They will be in a better position to answer you.

ryanolson commented 4 years ago

TRTIS has an ensemble API which allows you to chain models together.
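For reference, an ensemble in TRTIS is declared in the model's `config.pbtxt` by wiring one model's outputs to another's inputs. A rough sketch of what a detect-then-classify ensemble could look like (the model names, tensor names, and dimensions here are placeholders, not from the project; note that a plain ensemble connects tensors directly, so a cropping step between the detector and classifier would still need its own model or custom backend in the chain):

```protobuf
name: "detect_then_classify"
platform: "ensemble"
input [
  { name: "IMAGE" data_type: TYPE_FP32 dims: [ 3, 300, 300 ] }
]
output [
  { name: "CLASS_PROBS" data_type: TYPE_FP32 dims: [ 1000 ] }
]
ensemble_scheduling {
  step [
    {
      model_name: "ssd_detector"
      model_version: -1
      input_map  { key: "data"  value: "IMAGE" }
      output_map { key: "boxes" value: "DETECTED_BOXES" }
    },
    {
      model_name: "googlenet_classifier"
      model_version: -1
      input_map  { key: "data" value: "DETECTED_BOXES" }
      output_map { key: "prob" value: "CLASS_PROBS" }
    }
  ]
}
```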