dusty-nv / ros_deep_learning

Deep learning inference nodes for ROS / ROS2 with support for NVIDIA Jetson and TensorRT

Jetson Orin / Jetson Nano ROS Deep Learning #120

Closed Fibo27 closed 1 year ago

Fibo27 commented 1 year ago

I have a setup that consists of: 1) A robot running on a Jetson Nano with JetPack 4.6 and Foxy. A Raspberry Pi camera is connected, and I am able to transmit video over RTP. I am using a GStreamer pipeline, and the code used in my ROS node is:

pipeline_str = 'nvarguscamerasrc ! video/x-raw(memory:NVMM),width=1280,height=720,framerate=120/1 ! nvjpegenc ! rtpjpegpay ! udpsink host=192.xxx.xxx.xxx port=1234 sync=false async=false'
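(For reference, the same sender pipeline can be tested outside the ROS node with gst-launch-1.0; in a shell the caps string needs quoting because of the parentheses. The host address is the placeholder from above.)

```sh
gst-launch-1.0 nvarguscamerasrc \
  ! 'video/x-raw(memory:NVMM),width=1280,height=720,framerate=120/1' \
  ! nvjpegenc ! rtpjpegpay \
  ! udpsink host=192.xxx.xxx.xxx port=1234 sync=false async=false
```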

2) I have a Jetson Orin Nano on which I have set up jetson-inference. It is running JetPack 5.1.1, and I have also set up ROS2 Foxy. In a terminal I run "gst-launch-1.0 udpsrc port=1234 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! videoconvert ! autovideosink" and I am able to view the video transmitted by the ROS node running on the Jetson Nano on the robot. In jetson-inference, I can use one of the examples, detectnet.py, to capture data from my USB cam and run the detection algorithm.

I would like to use this detection algorithm on the objects being streamed over RTP from my robot. However, if I simply run ./detectnet.py rtp://@:1234, it shows an error. I think the encoding is causing the issue; even when I change the codec to JPEG via one of the video-viewer options, it does not work. Can you please suggest a workaround?

3) Given this setup, what is the best "out of the box" configuration for consuming the video stream sent from the robot on the remote Jetson Orin Nano? Thanks

Fibo27 commented 1 year ago

Hi - this was resolved - I changed the --input-codec flag to mjpeg!
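For anyone landing here later, the working invocation would look something like this (a sketch assuming the receiver listens on the same port 1234 as the sender; --input-codec=mjpeg tells the jetson-inference video source that the RTP payload is Motion-JPEG rather than the H.264 it otherwise expects, since RTP streams cannot be probed automatically):

```sh
./detectnet.py --input-codec=mjpeg rtp://@:1234
```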

dusty-nv commented 1 year ago

The input_codec ROS param from the launch files should now be fixed in commit https://github.com/dusty-nv/ros_deep_learning/commit/8f55223ffc252a57fbe3bd98a07b3e026f5a00f8

Alas, this change has not been propagated into the container builds yet.
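Once the containers pick up that commit, the codec should be settable from the launch file as well, along these lines (an untested sketch; the launch-file name and the input argument follow the repo's existing conventions, and input_codec is the param the commit above fixes):

```sh
ros2 launch ros_deep_learning detectnet.ros2.launch input:=rtp://@:1234 input_codec:=mjpeg
```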