ProgrammerZhujinming closed this issue 3 months ago
I've attached some screenshots of this problem. I hope they give you some useful information. Thank you.
The image you're using doesn't include the ONNXRuntime backend. Please use nvcr.io/nvidia/tritonserver:24.06-py3, which contains the ONNXRuntime backend. Thanks.
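For reference, once you switch to the full image, Triton looks up the backend from the model's config.pbtxt. A minimal sketch of a model-repository layout for an ONNX model might look like this (the model name, directory, and tensor names here are hypothetical examples, not taken from your setup):

```
model_repository/
└── densenet_onnx/
    ├── config.pbtxt
    └── 1/
        └── model.onnx

# config.pbtxt
name: "densenet_onnx"
backend: "onnxruntime"     # this is the backend missing from the -pyt- image
max_batch_size: 0
```

If config.pbtxt names `backend: "onnxruntime"` but the container doesn't ship that backend (as with the PyTorch-only `-pyt-` images), model loading fails at startup, which matches the behavior described here.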
Hi, I'm sorry to bother you, but I cannot solve this problem. I can successfully deploy a Torch model, but when I deploy an ONNX model I get this error. I also tried the DenseNet ONNX model from the official repository (https://github.com/triton-inference-server).
My image version is 24.06-pyt-python-py3, my torch version is 2.3.1-cuda11.8, onnx is 1.16.1, and onnxruntime is 1.18.1. Could you tell me what happened and how I can solve this problem? Thank you very much for your reply.