duckietown / gym-duckietown

Self-driving car simulator for the Duckietown universe
http://duckietown.org

Install CUDA, PyTorch or any DNN Library on the duckiebot itself? #264

Open bishoyroufael opened 2 years ago

bishoyroufael commented 2 years ago

I have a DB21M assembled and I'm confused about how to run neural networks on the bot itself. I SSHed into the bot but couldn't find any NVIDIA drivers or CUDA installed. Is there an easy way to set that up? I can't find anything useful about this in the docs.

From the NVIDIA docs, there should be a way to install JetPack and run some of the neural networks from the examples presented here. I want to use that together with ROS on the duckiebot.

FelixMildon commented 2 years ago

Can this please be explained, @tanij? Understanding this would help a lot.

bishoyroufael commented 2 years ago

After a lot of research and pain, I was able to make something work here as part of my thesis project. Feel free to use the Dockerfile in your own project.

It uses dustynv/jetson-inference as the base container, which ships with CUDA, PyTorch, and several pretrained DNN models ready to use. On top of that it adds ROS Melodic, so you can set ROS_MASTER_URI to the duckiebot's IP to get the communication working.
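
For reference, a minimal sketch of such a Dockerfile might look like the one below. It is not the actual file from the project: the base-image tag, the ROS install steps, and the placeholder IP are assumptions you would adjust to your own L4T/JetPack version and bot.

```dockerfile
# Hypothetical sketch, not the author's exact Dockerfile.
# Base: jetson-inference (CUDA, TensorRT, PyTorch, pretrained DNNs).
# Pick the tag matching your L4T release (check with: cat /etc/nv_tegra_release).
FROM dustynv/jetson-inference:r32.7.1

# Add ROS Melodic (the r32.x L4T images are based on Ubuntu 18.04 "bionic").
RUN apt-get update && apt-get install -y curl gnupg2 && \
    echo "deb http://packages.ros.org/ros/ubuntu bionic main" \
        > /etc/apt/sources.list.d/ros-latest.list && \
    curl -s https://raw.githubusercontent.com/ros/rosdistro/master/ros.asc | apt-key add - && \
    apt-get update && apt-get install -y ros-melodic-ros-base && \
    rm -rf /var/lib/apt/lists/*

# Default ROS master; override at run time, e.g.
#   docker run --runtime nvidia -e ROS_MASTER_URI=http://<duckiebot-ip>:11311 ...
ENV ROS_MASTER_URI=http://<duckiebot-ip>:11311

# Make the ROS environment available in interactive shells.
RUN echo "source /opt/ros/melodic/setup.bash" >> /root/.bashrc
```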

If you just want PyTorch, you can instead derive from l4t-pytorch. More details about that here.
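
A minimal sketch of that variant, assuming an r32.7.x JetPack install and the r32.7.1-pth1.10-py3 tag (check NGC for the tag that matches your release):

```dockerfile
# Hypothetical sketch: PyTorch-only base image from NVIDIA NGC.
# Choose the tag matching your L4T/JetPack release and PyTorch version.
FROM nvcr.io/nvidia/l4t-pytorch:r32.7.1-pth1.10-py3

# Optional sanity check: confirm PyTorch sees the GPU. This only reports True
# when the container is started with the NVIDIA runtime
# (e.g. docker run --runtime nvidia ...).
CMD ["python3", "-c", "import torch; print('CUDA available:', torch.cuda.is_available())"]
```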