Open blackhu opened 10 months ago
CC'ing @kthui and @nv-kmcgill53 , by any chance, have you seen this issue before?
Internal ticket: [5946]
Hi @blackhu,
docker run -it --rm --gpus all --net=host nvcr.io/nvidia/tritonserver:23.11-py3-igpu-sdk
docker run -it --rm --gpus all --net=host nvcr.io/nvidia/tritonserver:23.11-py3-sdk

I am wondering if you meant to download the nvcr.io/nvidia/tritonserver:23.11-py3-igpu container when you downloaded nvcr.io/nvidia/tritonserver:23.11-py3-sdk?
The containers published for 23.11 for iGPU are listed on NGC: https://catalog.ngc.nvidia.com/orgs/nvidia/containers/tritonserver/tags
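For reference, a minimal sketch of selecting the Jetson (iGPU) images from that NGC tags page. The docker invocation is commented out and hedged: on JetPack the NVIDIA container runtime (rather than `--gpus all`) is typically what exposes the integrated GPU, and the model repository path shown is hypothetical:

```shell
# Image tags taken from the NGC tags page above; the "-igpu" builds
# target Jetson devices, while the plain builds target x86_64.
SERVER_IMAGE="nvcr.io/nvidia/tritonserver:23.11-py3-igpu"     # server for Jetson
SDK_IMAGE="nvcr.io/nvidia/tritonserver:23.11-py3-igpu-sdk"    # client tools for Jetson

echo "server image:  ${SERVER_IMAGE}"
echo "clients image: ${SDK_IMAGE}"

# Hypothetical run command (assumption, not from the thread): on JetPack,
# "--runtime nvidia" is the usual way to give a container GPU access.
# docker run -it --rm --runtime nvidia --net=host \
#   -v /path/to/model_repository:/models \
#   "${SERVER_IMAGE}" tritonserver --model-repository=/models
```

The key distinction is that the `*-sdk` images contain only client utilities (e.g. perf_analyzer), so running a `-sdk` image will not start an inference server at all.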
I also encountered this issue, but it runs normally on JetPack 5.1.2.
Description
The Jetson Nano (JetPack 4.6.1) cannot use the GPU in the Triton inference container.
Triton Information
What version of Triton are you using?
Are you using the Triton container or did you build it yourself?
Triton container
To Reproduce
Steps to reproduce the behavior.

Describe the models (framework, inputs, outputs); ideally include the model configuration file (if using an ensemble, include the model configuration file for that as well).
Expected behavior
A clear and concise description of what you expected to happen.