triton-inference-server / server

The Triton Inference Server provides an optimized cloud and edge inferencing solution.
https://docs.nvidia.com/deeplearning/triton-inference-server/user-guide/docs/index.html
BSD 3-Clause "New" or "Revised" License

What is the latest Triton server release version available for JetPack 4.6.4 #7611

Open HuseyinSaidKoca opened 2 months ago

HuseyinSaidKoca commented 2 months ago

I know that Triton 2.18+ supports PyTorch, and we want to use that. The Jetson Nano's latest JetPack version is 4.6.4. Can this version install Triton 2.20?

We need PyTorch and Python backend support.
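
For clarity, this is roughly the kind of `model.py` we want to run under the Python backend. It is only a sketch: the tensor names (`INPUT0`/`OUTPUT0`) and the ReLU stand-in are assumptions for illustration, and the PyTorch wheel has to match the CUDA version shipped with the installed JetPack.

```python
# model.py -- minimal Triton Python backend model (sketch, not from this thread).
import json

import numpy as np
import torch  # requires a PyTorch build matching the device's JetPack/CUDA
import triton_python_backend_utils as pb_utils


class TritonPythonModel:
    def initialize(self, args):
        # args["model_config"] holds the serialized config.pbtxt as JSON.
        self.model_config = json.loads(args["model_config"])

    def execute(self, requests):
        responses = []
        for request in requests:
            # Pull the input tensor, run a placeholder PyTorch op, return the result.
            in0 = pb_utils.get_input_tensor_by_name(request, "INPUT0")
            x = torch.from_numpy(in0.as_numpy())
            y = torch.relu(x)  # stand-in for the real PyTorch model
            out0 = pb_utils.Tensor("OUTPUT0", y.numpy().astype(np.float32))
            responses.append(pb_utils.InferenceResponse(output_tensors=[out0]))
        return responses

    def finalize(self):
        pass
```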

rsemihkoca commented 2 months ago

@krishung5 do you have any info? Thank you in advance.

krishung5 commented 2 months ago

@nv-kmcgill53 Do you know where we can find the support matrix for Triton and JetPack versions?

rsemihkoca commented 2 months ago

We are currently using 2.19, because 2.20 requires CUDA 11, which does not install on the Jetson Nano.
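
For reference, a minimal sketch of how we check the device before choosing a Triton JetPack release. The file paths, regexes, and the CUDA 10.2 note for JetPack 4.6.x are assumptions about typical JetPack 4.x images, not guarantees.

```python
# check_jetson_versions.py -- sketch: confirm L4T and CUDA versions on a Jetson
# before picking a Triton JetPack release tarball.
import re
import subprocess
from pathlib import Path


def l4t_release():
    """Return the L4T release string from /etc/nv_tegra_release, if present."""
    p = Path("/etc/nv_tegra_release")
    if p.exists():
        m = re.search(r"R(\d+).*REVISION:\s*([\d.]+)", p.read_text())
        if m:
            return f"R{m.group(1)}.{m.group(2)}"
    return "unknown"


def cuda_version():
    """Return the CUDA toolkit version reported by nvcc, if installed."""
    try:
        out = subprocess.run(
            ["nvcc", "--version"], capture_output=True, text=True, check=True
        ).stdout
        m = re.search(r"release\s+([\d.]+)", out)
        return m.group(1) if m else "unknown"
    except (FileNotFoundError, subprocess.CalledProcessError):
        return "not installed"


if __name__ == "__main__":
    # JetPack 4.6.x ships CUDA 10.2, so a Triton release that needs CUDA 11
    # (as reported above for 2.20) is not usable on this device.
    print("L4T:", l4t_release())
    print("CUDA:", cuda_version())
```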