habjoel opened this issue 2 years ago
You should still be able to build the latest ORT release from source using CUDA 10.2 (rather than relying on older ORT versions because of the Jetson CUDA limitation). ONNX is statically linked into ORT, so you shouldn't need a specific ONNX version installed.
CC: @jywu-msft to keep me honest
Yes, you can follow the build instructions and build against CUDA 10.2 on Jetson without issue. For convenience, you can download pre-built onnxruntime 1.11 Python packages for JetPack 4.6.1 from NVIDIA's Jetson Zoo.
Hi, my Jetson AGX Xavier runs JetPack 5.0.1, which comes with CUDA 11.4, but when I try to run onnxruntime with the CUDA or TensorRT execution providers I keep getting errors. Do I need to downgrade my JetPack to 4.6.1 to run onnxruntime? Thanks in advance. @hariharans29 @jywu-msft
@zogojogo You probably need to build the runtime with custom parameters: https://forums.developer.nvidia.com/t/issue-using-onnxruntime-with-cudaexecutionprovider-on-orin/219457/8
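For reference, the linked thread amounts to building the wheel on the device with the CUDA/TensorRT locations passed explicitly. A sketch along the lines of the ORT Jetson build instructions; the paths below are typical JetPack defaults and may differ on your image:

```shell
# Build an onnxruntime-gpu wheel on the Jetson itself.
# --cuda_home / --cudnn_home / --tensorrt_home point at the typical
# JetPack install locations -- adjust them to your installation.
./build.sh --config Release --update --build --parallel --build_wheel \
    --use_tensorrt \
    --cuda_home /usr/local/cuda \
    --cudnn_home /usr/lib/aarch64-linux-gnu \
    --tensorrt_home /usr/lib/aarch64-linux-gnu
```

The resulting wheel lands under the build output directory and can be installed with pip.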
You can now use the OnnxRuntime 1.12.1 package from https://elinux.org/Jetson_Zoo#ONNX_Runtime which is built for JetPack 5.0.x and supports Jetson Orin.
Hi there,
I am a bit confused about ONNX and ORT version compatibility and hope to find some help here. I intend to run an instance segmentation model on my NVIDIA Jetson AGX Xavier (JetPack 4.6.1). It comes with CUDA 10.2 and cannot yet run a newer version like CUDA 11.x. (According to https://forums.developer.nvidia.com/t/installing-cuda-11-x-on-jetson-nano/169109)
So, am I correct in assuming that I can only run ORT 1.5-1.6 on my Jetson if I want it to work with CUDA (as stated here: https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html)? And does that further mean I need to have ONNX version <= 1.8 installed (as stated here: https://github.com/microsoft/onnxruntime/blob/master/docs/Versioning.md)?
I am asking because I converted a Mask R-CNN model from mmdetection to ONNX format, and running it in an ONNX Runtime Python script on the Jetson does not give me the correct results.
More generally: can I run any exported .onnx model on any ONNX/ORT version without issues, or does it depend on which onnx/ort versions were used for export?
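On the general question: ORT is backward-compatible, so what matters is the opset the model was exported with, not which onnx package version did the exporting. A model runs if its opset is at or below the maximum opset the ORT release supports. A minimal sketch of that rule; the version table below is an illustrative subset from ORT's Versioning.md at the time and should be checked against the current table:

```python
# Hedged sketch: ORT runs any model whose opset <= the max opset
# that ORT release supports (backward compatibility).
# Illustrative subset of the Versioning.md table -- verify before relying on it.
MAX_OPSET = {
    "1.5": 12,  # paired with ONNX 1.7
    "1.6": 13,  # paired with ONNX 1.8
    "1.7": 13,
    "1.8": 13,
}


def can_run(model_opset: int, ort_version: str) -> bool:
    """True if a model exported at model_opset should load on this ORT release."""
    return model_opset <= MAX_OPSET[ort_version]


print(can_run(11, "1.6"))  # opset-11 export loads on ORT 1.6
print(can_run(14, "1.6"))  # opset 14 needs a newer ORT release
```

So an exporter/runtime version mismatch by itself is fine; wrong outputs (as opposed to load errors) usually point at the export itself or at operator behavior, not version incompatibility.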
Thank you for your help!