dusty-nv / jetson-containers

Machine Learning Containers for NVIDIA Jetson and JetPack-L4T
MIT License

Can these containers (from the jetson-containers repo) run examples from the jetson-inference repo? #163

Open medphisiker opened 2 years ago

medphisiker commented 2 years ago

Hello, Dusty.

There is a folder "/.jetson-inference" inside the container with ROS 2 Foxy and PyTorch from the "jetson-containers" repo. I wanted to ask whether it is possible to run the neural-network demos from your "jetson-inference" repository (https://github.com/dusty-nv/jetson-inference) from a Docker image with ROS 2 Foxy and PyTorch from the "jetson-containers" repo, for example from the image "foxy-pytorch-l4t-r32.7.1"?

The examples from your "ros_deep_learning" repository (https://github.com/dusty-nv/ros_deep_learning) are also very interesting. In that repository, everything is installed and run directly on the Jetson Nano OS.

Is it possible to run the ROS 2 deep-learning nodes from the "ros_deep_learning" repository from a Docker image with ROS 2 and PyTorch from the "jetson-containers" repo, for example from the image "foxy-pytorch-l4t-r32.7.1"?

I'm quite a beginner; I apologize if the answers to these questions are obvious to everyone =)

dusty-nv commented 2 years ago

Hi @medphisiker, yes, you should be able to run the jetson-inference applications straight away in the ros:*pytorch* containers. That's because I use the jetson-inference container as the base of those containers (the jetson-inference container coincidentally includes PyTorch).
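For example, since the jetson-inference tools come pre-built in the base image, running one of the demos inside the container should be as direct as this sketch (the sample image path is illustrative, not from the thread):

```shell
# Inside the running ros:*pytorch* container, the jetson-inference
# demos are already built; e.g. run object detection on a test image.
# The input path below is illustrative -- point it at any image you have.
detectnet /jetson-inference/data/images/peds_0.jpg /data/peds_out.jpg
```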

You would need to install the ros_deep_learning repo into the container, though. For an example of doing that, see here: https://github.com/dusty-nv/jetbot_ros/blob/d8e5ee1b17f5c66d038017b86f8920a496197ea9/Dockerfile#L123
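As a minimal sketch of what that linked Dockerfile does — assuming a ROS 2 Foxy base built from source under /opt/ros/foxy, with the workspace path being an illustrative choice rather than the exact one used there:

```dockerfile
# Sketch: add ros_deep_learning to a ROS 2 + jetson-inference image.
# The workspace path and the ROS setup script location are assumptions;
# adapt them to match your base image.
SHELL ["/bin/bash", "-c"]

RUN mkdir -p /ros2_workspace/src && \
    cd /ros2_workspace/src && \
    git clone https://github.com/dusty-nv/ros_deep_learning

RUN source /opt/ros/foxy/install/setup.bash && \
    cd /ros2_workspace && \
    colcon build --symlink-install
```

The `SHELL` directive switches `RUN` to bash so that `source` works; the default `/bin/sh` in many base images does not support it.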

medphisiker commented 2 years ago

Thank you for the quick response and instructions) I will experiment)

medphisiker commented 2 years ago

"I use jetson-inference container as a base of those containers (the jetson-inference container coincidentally includes pytorch)" Yes, I already have a Docker image with ROS 2 Foxy and PyTorch from your repository, foxy-pytorch-l4t-r32.7.1.

When I try to run Docker with the jetson-inference image, Docker pulls all the layers for the jetson-inference image from the foxy-pytorch-l4t-r32.7.1 image. Both of them work the same, and the detectnet examples on images work well. I run foxy-pytorch-l4t-r32.7.1 via "docker/run.sh" from the jetson-inference repository, selecting this container with the "-c" parameter. Started this way, ROS 2 also sources itself. All works well.
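The launch described above looks roughly like this — the dustynv/ros image tag is assumed from the jetson-containers registry, and -c is the docker/run.sh flag for substituting the container image:

```shell
# From a checkout of the jetson-inference repo on the Jetson:
cd jetson-inference

# docker/run.sh sets up the usual mounts (data, models, X11),
# but -c swaps in the ROS 2 Foxy + PyTorch image from jetson-containers
# instead of the default jetson-inference container.
docker/run.sh -c dustynv/ros:foxy-pytorch-l4t-r32.7.1
```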

The Docker image (foxy-pytorch-l4t-r32.7.1) also has a problem that was discussed in this issue: link to issue. Interestingly, the jetson-inference Docker image for JetPack 4.6 did not have this problem.

But that is another story =). Thank you for your help and fast reply. My question is answered, and I think we can close this issue )