I have successfully run tensorrt_server on several boxes, but I am having trouble getting it to run on one of ours. It is Ubuntu Linux 14.04 with docker-engine 1.13.1 and nvidia-docker 1. After building with Dockerfile.tensorrt_server, when I try to run the server it just hangs after the log message "Initializing TensorRT classifiers". Any suggestions on how I can determine the problem? I'm stumped!

Update -- it did finally initialize, but it took almost 4 minutes. On the other boxes it initializes in 3-4 seconds.

Sorry for the long latency in answering.
This is often due to model preparation and/or CUDA JIT compilation when you have a GPU architecture that the code (or TensorRT) was not compiled against.
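Not from the original thread, but here is a minimal sketch of how you could check which compute capability the slow box's GPU reports, so it can be compared against the arch/-gencode flags used when the image was built (how those flags appear in Dockerfile.tensorrt_server is an assumption here). If the GPU's arch has no matching cubin in the binary and only PTX is embedded, the driver JIT-compiles the kernels on first use, which can take minutes and look like a hang at startup.

```cpp
// Sketch: print each visible GPU's name and compute capability.
// Compare the reported capability (e.g. 6.1) against the arch list the
// server image was compiled for; a mismatch points at slow PTX JIT.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess || count == 0) {
        std::fprintf(stderr, "No CUDA device visible: %s\n",
                     cudaGetErrorString(err));
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        if (cudaGetDeviceProperties(&prop, i) != cudaSuccess) continue;
        std::printf("GPU %d: %s, compute capability %d.%d\n",
                    i, prop.name, prop.major, prop.minor);
    }
    return 0;
}
```

If this is the cause, you could compile and run the snippet inside the container (e.g. `nvcc check_arch.cpp -o check_arch`, launched via nvidia-docker) on both a fast box and the slow one; a different compute capability on the slow box would support the JIT explanation.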