Closed: tldrafael closed this issue 1 year ago.
@tldrafael It seems like you are trying to load a TRT model using the PyTorch backend, which is incorrect. You should use the TensorRT backend for TRT models: set `backend: "tensorrt"` in the model configuration and save the model as a `model.plan` file instead of `model.pt`.
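For reference, a minimal `config.pbtxt` along those lines might look as follows. The tensor names and dimensions here are illustrative for a ResNet-50 classifier, not taken from the issue:

```
name: "resnet50"
backend: "tensorrt"
max_batch_size: 8
input [
  {
    name: "input__0"
    data_type: TYPE_FP32
    dims: [ 3, 224, 224 ]
  }
]
output [
  {
    name: "output__0"
    data_type: TYPE_FP32
    dims: [ 1000 ]
  }
]
```

with the serialized engine placed in the standard repository layout:

```
model_repository/
└── resnet50/
    ├── config.pbtxt
    └── 1/
        └── model.plan
```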
Thank you @Tabrizian, it worked! I added `backend: "tensorrt"` in the `config.pbtxt` file, and changed the Triton server Docker image to `nvcr.io/nvidia/tritonserver:22.01-py3`.
Hi, I'm getting the following error when using a model converted by torch2trt in the Triton server:

model_repository_manager.cc:1152] failed to load 'resnet50' version 1: Internal: failed to load model 'resnet50': PytorchStreamReader failed reading zip archive: failed finding central directory

The entire log: triton.err.log. I have seen issues #1264 and #212. My doubt is: am I missing some step here?
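The error message itself points at the root cause: Triton's PyTorch backend expects a TorchScript file, which is a zip archive (it starts with the `PK` magic bytes and ends with a central directory), while a serialized TensorRT engine is an opaque binary blob. A small sketch illustrating the distinction (the file contents and helper name below are stand-ins, not real models):

```python
# Sketch: a TorchScript .pt file is a zip archive, so PytorchStreamReader's
# zip parser fails on a raw TensorRT engine ("failed finding central
# directory"). The payloads here are fabricated stand-ins for illustration.
import io
import zipfile

def looks_like_torchscript(data: bytes) -> bool:
    """TorchScript archives start with the zip local-file-header magic 'PK'."""
    return data[:2] == b"PK"

# A real zip archive, shaped like what the PyTorch backend expects:
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("model/data.pkl", b"stand-in payload")
torchscript_like = buf.getvalue()

# An arbitrary binary blob, standing in for a serialized TRT engine:
trt_like = b"\x00\x01fake-tensorrt-engine-bytes"

print(looks_like_torchscript(torchscript_like))  # True
print(looks_like_torchscript(trt_like))          # False
```

This is why pointing the PyTorch backend at a `.plan` file fails immediately at load time rather than at inference time.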
Reproducible example
I used the TensorRT docker image to generate the trt model:
Then, in the Python console inside the container:

Then, I got this error when I loaded the Triton server:
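For context, the conversion step omitted above could look roughly like the following. This is a hypothetical sketch, not the reporter's actual code: it assumes a GPU, the torch2trt package, and illustrative paths, and it shows serializing the underlying TensorRT engine to `model.plan` as the answer recommends (rather than saving a TorchScript `model.pt`):

```python
# Hypothetical sketch: convert a torchvision ResNet-50 with torch2trt and
# write the serialized TensorRT engine for Triton's tensorrt backend.
# Requires a CUDA GPU and the torch2trt package; paths are illustrative.
import torch
from torch2trt import torch2trt
from torchvision.models import resnet50

model = resnet50(pretrained=True).eval().cuda()
x = torch.ones((1, 3, 224, 224)).cuda()   # example input used for tracing

model_trt = torch2trt(model, [x])         # build the TensorRT engine

# Triton's tensorrt backend expects the raw serialized engine bytes,
# not a TorchScript zip archive, so write them out as model.plan:
with open("model_repository/resnet50/1/model.plan", "wb") as f:
    f.write(model_trt.engine.serialize())
```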