Qengineering / Jetson-Nano-Ubuntu-20-image

Jetson Nano with Ubuntu 20.04 image
https://qengineering.eu/install-ubuntu-20.04-on-jetson-nano.html
BSD 3-Clause "New" or "Revised" License

Is it possible to have an image with JetPack 5.0 and CUDA 11? #103

Closed tjvvai closed 3 days ago

tjvvai commented 4 days ago

Thank you so much for this image.

I was wondering whether we could get another image with CUDA 11 installed. I was trying to run a YOLOv10 model on the Jetson Nano and managed to convert the .pt model to .onnx format. However, when I convert to a .engine model, it says the Mod operator is not supported on TensorRT 8.0.1.6. But to get a newer version of TensorRT, we need a newer CUDA version, so here I am. I asked ChatGPT, and it says CUDA 11.4 might be supported on the Jetson Nano.
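For reference, the export step looks roughly like this (a sketch, assuming the ultralytics-style Python API that the YOLOv10 fork exposes; the weights file name is a placeholder):

```python
# Sketch: export a YOLOv10 checkpoint to ONNX on the host machine.
# "yolov10n.pt" is a placeholder for whatever checkpoint is being used.
from ultralytics import YOLO

model = YOLO("yolov10n.pt")
model.export(format="onnx", opset=13, simplify=True, imgsz=640)  # writes yolov10n.onnx
```

The .onnx file is then copied to the Nano and fed to trtexec, which is where the unsupported-operator error appears.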

Is it possible to have an image with JetPack 5.0 and CUDA 11 built in? Thanks in advance.

From ChatGPT:


As of the latest updates (November 2024), the Jetson Nano supports CUDA 11.x versions. Here's a more detailed breakdown:

CUDA Support by JetPack Version:

- JetPack 4.6 (for Ubuntu 18.04):
  - Supports CUDA 10.2.
  - This is one of the most stable versions for the Jetson Nano.
- JetPack 4.6.1 (for Ubuntu 18.04):
  - Continues to support CUDA 10.2.
- JetPack 5.0 (for Ubuntu 20.04):
  - Introduces support for CUDA 11.4.
  - This version is compatible with the latest software tools, such as TensorRT, and supports newer libraries like TensorFlow and PyTorch.
- JetPack 5.0.2 (latest at the time of writing, released in 2024):
  - Supports CUDA 11.4 and comes with additional updates like better performance, bug fixes, and new tools.
Qengineering commented 3 days ago

ChatGPT is a great tool; we use it every day. However, it makes mistakes. If someone claims on some website that he installed CUDA 11 on his Nano, ChatGPT follows that author without proper checking. Better to use Perplexity AI: the same AI power, but it shows the references it cites, so you can check the information yourself.

CUDA 11 cannot be installed on a Jetson Nano. The low-level driver infrastructure lacks the support needed for version 11. Despite NVIDIA's clear statement that it isn't possible, many people have tried; no one succeeded.
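For a quick sanity check of what this image actually ships, you can print the installed versions on the Nano itself (a sketch; /usr/local/cuda is the standard JetPack location):

```python
# Sketch: report the TensorRT and CUDA versions present on the Nano.
import subprocess
import tensorrt as trt

print("TensorRT:", trt.__version__)             # 8.0.1.6 on this image, per this thread
print(subprocess.check_output(
    ["/usr/local/cuda/bin/nvcc", "--version"],
    text=True))                                 # CUDA 10.2, not 11.x
```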

tjvvai commented 3 days ago

Thank you so much for the confirmation. I definitely trust you more than ChatGPT. I guess ChatGPT is still not that trustworthy, is it?

So currently, we managed to convert yolov10.pt to .onnx, but cannot convert the .onnx to .engine because the Mod operation is not supported on TensorRT 8.0.1.6, as shown in the image below.
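The same failure can be reproduced outside trtexec with the TensorRT Python bindings (assuming they are installed on the image); a minimal sketch, with the ONNX file name as a placeholder:

```python
# Sketch: parse the exported ONNX with TensorRT's Python API and print
# the parser errors (e.g. the unsupported Mod operator on TensorRT 8.0.1.6).
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

with open("yolov10n.onnx", "rb") as f:          # placeholder file name
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
```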

May I ask whether you have run into similar issues before? Someone suggested writing a customized plugin to support that operation. What's your opinion?

thanks in advance!

Screenshot from 2024-11-19 01-48-58

Qengineering commented 3 days ago

And that's always the nature of things. While one piece of technology progresses and the other stands still, a day comes when the two no longer match. A newer ONNX opset is needed to describe the YoloV10 model (the Mod operator, for instance), and the trtexec found on the Nano doesn't support the latest opset. You can try modifying the ONNX model so it doesn't need the latest operations, but that's hard and error-prone. Or you use an 'older' model, like YoloV8 or YoloV5. There isn't that much performance gain between YoloV10 and V8. The major difference is that YoloV10 integrates the NMS function into the model, while YoloV8 has the same functionality in the post-processing.
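A sketch of that route, assuming the standard ultralytics package on a host PC and the stock trtexec on the Nano (model and file names are placeholders):

```python
# Sketch: export YoloV8 to ONNX with an older opset that TensorRT 8.0.1.6
# can parse; NMS is then handled in Python post-processing on the Nano.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")                      # placeholder weights
model.export(format="onnx", opset=12, simplify=True, imgsz=640)
```

The resulting .onnx can then be converted on the Nano with the bundled trtexec, for example: /usr/src/tensorrt/bin/trtexec --onnx=yolov8n.onnx --saveEngine=yolov8n.engine --fp16.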

tjvvai commented 3 days ago

Most of the time, on general topics, ChatGPT does its job. When it comes to niche details, expertise is still required. On this topic, you are definitely better than ChatGPT.

Thanks for the advice. For the Yolo model, I was wondering whether the YoloV8 model still contains the Mod operation, considering that modulo division is quite basic. I will test the YoloV8 model on the Jetson Nano and let you know the results.
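A quick way to check that before going to the Nano is to list the operator types in the exported ONNX graph (a sketch; the file name is a placeholder):

```python
# Sketch: list the ONNX operator types used by an exported model and
# check whether 'Mod' is among them.
import onnx

graph = onnx.load("yolov8n.onnx").graph         # placeholder file name
ops = sorted({node.op_type for node in graph.node})
print(ops)
print("Contains Mod:", "Mod" in ops)
```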

One thing we might need your knowledge on: we are trying to find edge devices that support CUDA 11. We are thinking about the Jetson Orin Nano 8GB. Do you have any better recommendations?

Thanks again!

Qengineering commented 3 days ago

You have several options:

- Jetson Orin Nano. Perfect, however expensive.
- Rock 5C. The NPU hits 6 TOPS. Cheap and very powerful; we use it all the time. Made in China.
- Raspberry Pi + AI HAT (Hailo). Hits 26 TOPS. Modest in price. However, you're bound to the Hailo software, which is not the most user-friendly.

tjvvai commented 3 days ago

Brilliant, thank you so much.

Currently, we are verifying our pipeline with a deployment on the farms. Making it work is our priority for now, so we might go for the Jetson Orin Nano. We can think about price and other factors at the scaling stage.

Many thanks for all your valuable opinions and expertise. It is very nice to chat with you!

Hope you have a wonderful day.