NVIDIA-ISAAC-ROS / isaac_ros_argus_camera

ROS 2 packages based on NVIDIA libArgus library for NVIDIA-accelerated CSI camera support.
https://developer.nvidia.com/isaac-ros-gems
Apache License 2.0

Performance much worse than GStreamer pipeline with nvarguscamera source #9

Closed: carTloyal123 closed this issue 1 year ago

carTloyal123 commented 2 years ago

I am on a Jetson Nano, and the isaac_ros_argus_camera nodes are much slower than the GStreamer pipeline shown below: I get about 8 fps from Isaac, where I get closer to 60 fps with GStreamer. Any suggestions on what might be causing this, or am I missing something else here? I am using Isaac as described in the quick start guide.
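(For reference, the publish rate can be checked with `ros2 topic hz` on the camera topic, or with a small rclpy subscriber like the sketch below; the topic name here is an assumption and should be adjusted to whatever the Argus node publishes on your setup.)

```python
# Minimal frame-rate check with rclpy. The topic name is an assumption;
# point it at whatever image topic the Argus camera node publishes.
import time

import rclpy
from rclpy.node import Node
from rclpy.qos import qos_profile_sensor_data
from sensor_msgs.msg import Image


class FpsCounter(Node):
    def __init__(self):
        super().__init__('fps_counter')
        self._count = 0
        self._start = time.monotonic()
        self.create_subscription(Image, '/left/image_raw', self._on_image,
                                 qos_profile_sensor_data)
        self.create_timer(5.0, self._report)

    def _on_image(self, _msg):
        self._count += 1

    def _report(self):
        elapsed = time.monotonic() - self._start
        self.get_logger().info(f'{self._count / elapsed:.1f} fps over {elapsed:.1f} s')
        self._count = 0
        self._start = time.monotonic()


def main():
    rclpy.init()
    rclpy.spin(FpsCounter())


if __name__ == '__main__':
    main()
```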

gordongrigor commented 2 years ago

Is this reported on ROS2 Foxy?

In ROS2 Foxy (and other versions), topics from the camera node would be moved from GPU to CPU memory, and then back from CPU to GPU for processing in a downstream node in the graph. In GStreamer, a similar pipeline would operate entirely in HW. ROS2 Humble introduces a new feature that allows topics to be adapted to a format that works with hardware acceleration (REP 2007), which is implemented in the latest release.

Can you share more details on the camera used? We list the cameras we are testing against in the list of reference cameras.

carTloyal123 commented 2 years ago

Hey there, thank you for the reply. I am running ROS2 Galactic, which I assume has the same issue as Foxy. Is updating to Humble the only way to address the data format conversions? I am using a CSI IMX219 camera on the Jetson Nano. I am building a GStreamer pipeline and then using the OpenCV VideoCapture object to grab images.

On a side note, does anyone know much about image pipelines using NVIDIA components? I have been trying to get a GStreamer pipeline working that converts the raw CSI feed into an RGB image without using the native videoconvert element, since it is slow on the CPU. Thanks!

gordongrigor commented 2 years ago

Thanks for confirming Galactic. What was the pipeline referred to in the original post? It isn't shown there.

In Humble, Isaac builds graphs that use HW acceleration from the CSI camera input through to RGB output, if RGB is what you need as the output.

You'll need to ask on the NVIDIA developer forum about raw CSI Bayer camera conversion to RGB in GStreamer, which I suspect will use nvgstcapture, as this area is focused on Isaac in ROS.

carTloyal123 commented 2 years ago

Hi Gordon, thank you for the response!

My goal is to convert the CSI camera feed into RGB as fast as possible on the Jetson Nano, to get the highest frame rate with the lowest CPU usage. I would like to achieve around 45 fps at a resolution of 720x480. I am trying to do this using the OpenCV VideoCapture object and a GStreamer pipeline, but I have to include a videoconvert element in my pipeline or the data stream will not reach the appsink. The pipeline is shown below:

nvarguscamerasrc ! video/x-raw(memory:NVMM), width=1280, height=720, framerate=30/1 ! nvvidconv flip-method=0 ! video/x-raw, format=BGRx ! videoconvert ! video/x-raw, format=BGR ! appsink
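For context, a minimal sketch of wrapping that pipeline in an OpenCV VideoCapture (assuming an OpenCV build with GStreamer support, which the JetPack-provided build normally has); the appsink `drop`/`max-buffers` properties are an addition here to limit buffering latency:

```python
# Sketch: grab frames from the nvarguscamerasrc pipeline above via OpenCV.
# Requires OpenCV built with the GStreamer backend.
import cv2

pipeline = (
    "nvarguscamerasrc ! "
    "video/x-raw(memory:NVMM), width=1280, height=720, framerate=30/1 ! "
    "nvvidconv flip-method=0 ! video/x-raw, format=BGRx ! "
    "videoconvert ! video/x-raw, format=BGR ! "
    "appsink drop=true max-buffers=1"
)

cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
if not cap.isOpened():
    raise RuntimeError("Failed to open GStreamer pipeline")

try:
    while True:
        ok, frame = cap.read()  # 1280x720 BGR numpy array
        if not ok:
            break
        # ... process frame here ...
finally:
    cap.release()
```

On the videoconvert question: nvvidconv does the scaling and colour conversion in hardware, but as far as I know it cannot output packed 3-channel BGR, only 4-channel formats such as BGRx. The CPU videoconvert at the end is therefore only repacking BGRx into the BGR layout the OpenCV appsink expects, which is much cheaper than doing the whole conversion on the CPU.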
gordongrigor commented 1 year ago

This question is best asked in the Jetson forums.

At the highest level, I suspect DeepStream can take the 720p camera stream, downscale it to 480p in HW, and transfer it from the GPU to the CPU for you. A similar ROS pipeline should work: the Argus camera node at 720p, into image_proc to downscale to 480p, then publish the topic from GPU to CPU for processing.
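A sketch of that graph as a ROS 2 launch file is below. The plugin, parameter, and topic names are assumptions; check the isaac_ros_argus_camera and isaac_ros_image_proc documentation for the exact names in your release.

```python
# Sketch: Argus mono camera at 720p feeding a resize node that downscales to
# 720x480, both composed in one container. Plugin/parameter/topic names are
# assumptions; verify against the package docs for your release.
from launch import LaunchDescription
from launch_ros.actions import ComposableNodeContainer
from launch_ros.descriptions import ComposableNode


def generate_launch_description():
    argus_node = ComposableNode(
        package='isaac_ros_argus_camera',
        plugin='nvidia::isaac_ros::argus::ArgusMonoNode',
        name='argus_mono',
    )

    resize_node = ComposableNode(
        package='isaac_ros_image_proc',
        plugin='nvidia::isaac_ros::image_proc::ResizeNode',
        name='resize',
        parameters=[{
            'output_width': 720,
            'output_height': 480,
        }],
        # Remap the resize node's input to the Argus output topics
        # (topic names may differ between releases).
        remappings=[
            ('image', 'left/image_raw'),
            ('camera_info', 'left/camerainfo'),
        ],
    )

    container = ComposableNodeContainer(
        name='argus_resize_container',
        namespace='',
        package='rclcpp_components',
        executable='component_container_mt',
        composable_node_descriptions=[argus_node, resize_node],
        output='screen',
    )

    return LaunchDescription([container])
```

Composing both nodes in a single container is what lets the Humble type-adaptation (NITROS) path keep the image in GPU memory between the camera and the resize step, so the GPU-to-CPU copy only happens once, for the downscaled image.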

gorghino commented 1 year ago

Hi @carTloyal123, did you fix it eventually? Did you try Humble?