triton-inference-server / server

The Triton Inference Server provides an optimized cloud and edge inferencing solution.
https://docs.nvidia.com/deeplearning/triton-inference-server/user-guide/docs/index.html
BSD 3-Clause "New" or "Revised" License

What is the latest Triton server release version available for JetPack 4.6.4? #7611


HuseyinSaidKoca commented 1 week ago

I know that Triton 2.18+ supports PyTorch, and we want to use that. Our Jetson Nano is on JetPack 4.6.4, the latest version it supports. Can this version install Triton 2.20?

We need PyTorch and Python backend support.
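
For reference, what we mean by Python backend support is being able to serve a `model.py` like the minimal sketch below. The `INPUT0`/`OUTPUT0` tensor names are placeholders and would have to match the model's `config.pbtxt`:

```python
# model.py -- minimal identity model for the Triton Python backend.
import triton_python_backend_utils as pb_utils


class TritonPythonModel:
    def execute(self, requests):
        responses = []
        for request in requests:
            # Read the input tensor and echo it back unchanged.
            input0 = pb_utils.get_input_tensor_by_name(request, "INPUT0")
            output0 = pb_utils.Tensor("OUTPUT0", input0.as_numpy())
            responses.append(
                pb_utils.InferenceResponse(output_tensors=[output0])
            )
        return responses
```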

rsemihkoca commented 1 week ago

@krishung5 do you have any info? Thank you in advance.

krishung5 commented 3 days ago

@nv-kmcgill53 Do you know where we can find the support matrix for Triton and JetPack versions?

rsemihkoca commented 3 days ago

Currently we are using 2.19, because 2.20 requires CUDA 11 and that cannot be installed on the Jetson Nano.
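
For anyone checking the same constraint on their own device, a small sketch like the one below (the file name `check_jetson_env.py` is just illustrative) prints the installed L4T release and CUDA toolkit version when run on the Jetson itself. JetPack 4.6.x ships the CUDA 10.2 toolkit, which is what blocks CUDA 11 builds such as 2.20:

```python
# check_jetson_env.py -- quick sanity check of the L4T release and CUDA
# toolkit installed on the device (run directly on the Jetson).
import subprocess
from pathlib import Path

# /etc/nv_tegra_release records the installed L4T release; JetPack 4.6.4
# corresponds to the L4T R32.7 line on the Nano.
l4t_release = Path("/etc/nv_tegra_release")
if l4t_release.exists():
    print("L4T release:", l4t_release.read_text().splitlines()[0])
else:
    print("/etc/nv_tegra_release not found -- is this a Jetson device?")

# JetPack 4.6.x includes CUDA 10.2, so Triton releases built against
# CUDA 11 (e.g. 2.20) cannot be installed here.
try:
    nvcc = subprocess.run(
        ["nvcc", "--version"],
        stdout=subprocess.PIPE, stderr=subprocess.PIPE,
        universal_newlines=True,
    )
    if nvcc.returncode == 0:
        # Last line of nvcc --version reports the toolkit release.
        print(nvcc.stdout.strip().splitlines()[-1])
    else:
        print("nvcc returned an error:", nvcc.stderr.strip())
except FileNotFoundError:
    print("nvcc not found; check that /usr/local/cuda/bin is on PATH")
```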