dusty-nv / ros_deep_learning

Deep learning inference nodes for ROS / ROS2 with support for NVIDIA Jetson and TensorRT

Running TensorRT model in ROS #67

Closed sisaha9 closed 3 years ago

sisaha9 commented 3 years ago

This may be slightly off topic, but I haven't found many resources on running a TensorRT model in ROS. I want to run a Mask R-CNN model on my Jetson Xavier NX. I was initially using Detectron2, but its inference was too slow (around 5 FPS). From what I can tell, Detectron2 is not convertible to TensorRT yet, but Matterport's Mask R-CNN is (https://github.com/NVIDIA/TensorRT/tree/master/samples/opensource/sampleUffMaskRCNN). I am close to completing a TensorRT sample based on those instructions. Is there any resource I could use to load a TensorRT model in ROS, take in camera input, and publish the segmentation results in a custom message, much like this project does: https://github.com/DavidFernandezChaves/Detectron2_ros/blob/master/msg/Result.msg? Any guidance would be greatly appreciated.

dusty-nv commented 3 years ago

Hi @sisaha9, we will be coming out with a MaskRCNN node for ROS at some point in the future.

Here is a package that runs PyTorch models in TensorRT for ROS2 - https://github.com/NVIDIA-AI-IOT/ros2_torch_trt

To make your own node from the sampleUffMaskRCNN sample, you would essentially wrap that TensorRT sample code in a ROS node: subscribe to the camera topic, run the engine on each frame, and publish the results.
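
For illustration only, a rough rospy sketch of that pattern might look something like this (the topic names, engine path, and single-input/single-output buffer layout are assumptions, and the Mask R-CNN pre/post-processing and a proper custom result message are omitted - this is not a tested implementation):

```python
#!/usr/bin/env python3
# Sketch: wrap a serialized TensorRT engine in a ROS (rospy) node that
# subscribes to a camera topic, runs inference, and publishes the raw output.
import numpy as np
import rospy
import tensorrt as trt
import pycuda.autoinit  # creates a CUDA context for this process
import pycuda.driver as cuda
from cv_bridge import CvBridge
from sensor_msgs.msg import Image
from std_msgs.msg import Float32MultiArray


class TrtNode:
    def __init__(self, engine_path):
        logger = trt.Logger(trt.Logger.WARNING)
        with open(engine_path, "rb") as f, trt.Runtime(logger) as runtime:
            self.engine = runtime.deserialize_cuda_engine(f.read())
        self.context = self.engine.create_execution_context()

        # Allocate host/device buffers for each binding (assumes static shapes).
        self.host_bufs, self.dev_bufs = [], []
        for i in range(self.engine.num_bindings):
            shape = self.engine.get_binding_shape(i)
            dtype = trt.nptype(self.engine.get_binding_dtype(i))
            host = cuda.pagelocked_empty(int(np.prod(shape)), dtype)
            self.host_bufs.append(host)
            self.dev_bufs.append(cuda.mem_alloc(host.nbytes))

        self.bridge = CvBridge()
        # Placeholder output type; a real segmentation node would define a
        # custom message (boxes, class ids, scores, masks) like Result.msg.
        self.pub = rospy.Publisher("trt/output", Float32MultiArray, queue_size=1)
        rospy.Subscriber("camera/image_raw", Image, self.on_image, queue_size=1)

    def on_image(self, msg):
        img = self.bridge.imgmsg_to_cv2(msg, desired_encoding="rgb8")

        # Real code must resize/normalize the image to the engine's input
        # shape; here we just truncate/zero-pad as a placeholder.
        flat = np.zeros(self.host_bufs[0].size, dtype=self.host_bufs[0].dtype)
        n = min(flat.size, img.size)
        flat[:n] = img.astype(flat.dtype).ravel()[:n]
        np.copyto(self.host_bufs[0], flat)

        # Copy input to the GPU, run the engine, copy the last binding back.
        cuda.memcpy_htod(self.dev_bufs[0], self.host_bufs[0])
        self.context.execute_v2(bindings=[int(d) for d in self.dev_bufs])
        cuda.memcpy_dtoh(self.host_bufs[-1], self.dev_bufs[-1])

        self.pub.publish(Float32MultiArray(data=self.host_bufs[-1].tolist()))


if __name__ == "__main__":
    rospy.init_node("trt_maskrcnn")
    TrtNode(rospy.get_param("~engine_path", "model.engine"))
    rospy.spin()
```

The same structure carries over to rclpy or a C++ node; the ROS-specific part is only the subscriber callback and publisher around the TensorRT execute call.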