NVIDIA / gpu-rest-engine

A REST API for Caffe using Docker and Go
BSD 3-Clause "New" or "Revised" License

tensorrt_server hangs "Initializing Tensorrt classifiers" #30

Closed mkh-github closed 6 years ago

mkh-github commented 6 years ago

I have successfully run tensorrt_server on several boxes, but I am having trouble getting it to run on one of our boxes. It is Ubuntu Linux 14.04 with docker-engine 1.13.1 and nvidia-docker 1. After building with Dockerfile.tensorrt_server, when I try to run the server it just hangs after the log message "Initializing TensorRT classifiers". Any suggestions on how I can determine the problem? I'm stumped!

mkh-github commented 6 years ago

Update -- it did finally initialize, but it took almost 4 minutes. When I run it on the other boxes, it initializes in 3-4 seconds.

flx42 commented 6 years ago

Sorry for the long delay in answering.

It's often due to model preparation and/or CUDA JIT compilation, which kicks in when the GPU architecture on that box is one your code (or TensorRT) was not compiled against.
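One way to check whether the JIT path is the likely culprit is to compare the GPU's compute capability against the `-gencode` targets the binaries in the image were built for. Below is a minimal sketch (not part of this repository) that prints each device's compute capability using the standard CUDA runtime API; the file name `check_arch.cu` is just an example.

```cuda
// check_arch.cu -- minimal sketch: print each GPU's compute capability so it
// can be compared against the arch list the Caffe/TensorRT binaries were
// compiled for. If the architecture is missing, the driver JIT-compiles the
// embedded PTX at startup, which can take minutes on the first run.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess || count == 0) {
        std::fprintf(stderr, "no CUDA device found: %s\n", cudaGetErrorString(err));
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        // e.g. "GPU 0: Tesla P100-PCIE-16GB (compute capability 6.0)"
        std::printf("GPU %d: %s (compute capability %d.%d)\n",
                    i, prop.name, prop.major, prop.minor);
    }
    return 0;
}
```

Compile with `nvcc check_arch.cu -o check_arch` and run it inside the container. If the reported compute capability is not among the architectures the binaries were built for, the driver falls back to JIT-compiling PTX on first launch; the result is cached (by default under ~/.nv/ComputeCache on Linux), so subsequent starts should be much faster, which would match the 4-minute first run described above.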