Closed: flian2 closed this issue 2 weeks ago
Hi @flian2, I've tried the reproducer in 24.08 and everything went through. Could you please check 24.08 as well?
Hi @oandreeva-nv , thanks! I'll give it a try.
Hi @oandreeva-nv, I tried the image nvcr.io/nvidia/tritonserver:24.08-py3, but I see the same error when running the client code. Could you tell me which tritonclient version you used when you reproduced it? Thanks!
Any updates? I'm having the same issue with these versions: 24.08-py3, 24.09-py3, 24.02-py3. Could it be related to AWS? I'm also running my Docker container inside an EC2 instance with the Amazon Linux AMI.
Ok, apparently I was able to find the issue. Your conda pack may be installing a numpy version that conflicts with torch or another package. In my case I was packing a numpy version higher than 2, which isn't supported by torch.
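A quick way to confirm whether the packed environment picked up numpy 2.x is to check the major version at import time. This is a hedged sketch (the helper name `numpy_major` is mine, not from the thread), based on the observation above that torch builds compiled against the numpy 1.x ABI can misbehave under numpy 2.x:

```python
import numpy as np

def numpy_major(version: str) -> int:
    """Return the major component of a numpy version string, e.g. '1.26.4' -> 1."""
    return int(version.split(".")[0])

# If this reports a major version >= 2, the packed environment may conflict
# with torch builds that were compiled against the numpy 1.x ABI.
print(f"numpy {np.__version__} (major version {numpy_major(np.__version__)})")
```

Running this inside the conda-packed environment (not on the host) is what matters, since that is the interpreter the python backend uses.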
Hey @mauriciocm9 @flian2,
Hopefully this PR resolves the issue you are experiencing, allowing you to install either numpy 1.x or numpy 2.x in your environment without any issues from the python backend. This fix will tentatively be part of the 24.11 release.
For now, as @mauriciocm9 recommended, I would restrict the numpy version in your conda environment to numpy<2.
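The pin can be expressed in the environment file used to build the conda pack. A minimal sketch (the environment name and exact package list are assumptions, not from the thread):

```yaml
# environment.yml for the conda-pack loaded by the python backend.
# numpy is pinned below 2 because the torch build in use expects numpy 1.x.
name: triton-pytorch
dependencies:
  - python=3.10
  - numpy<2
  - pytorch
```

After recreating the environment from this file, repack it (e.g. with conda-pack) and point the model's config at the new archive.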
Closing this issue due to inactivity. Please feel free to reopen if needed.
Description
I’m using the Triton python_backend to run the pytorch example in the python_backend repo. I packaged the pytorch dependencies into a conda environment and can load the model successfully. However, when running the client inference script provided in the repo, I encounter the following error when trying to get the output from the httpclient response. It seems that the response is empty.
I tried running the add_sub example and the response.as_numpy("OUTPUT0") worked fine with the expected output.
Triton Information
What version of Triton are you using?
server_version 2.41.0
Are you using the Triton container or did you build it yourself?
I’m using a sagemaker docker image for triton server: 763104351884.dkr.ecr.us-east-1.amazonaws.com/sagemaker-tritonserver:23.12-py3
To Reproduce
The model.py
config.pbtxt:
The client.py:
Expected behavior
I expect the client script to convert the output tensors in the HTTP response to numpy arrays.