dusty-nv / jetson-containers

Machine Learning Containers for NVIDIA Jetson and JetPack-L4T
MIT License

ROS2 node using nano_llm #603

Closed · Fibo27 closed this issue 2 months ago

Fibo27 commented 2 months ago

@dusty-nv, do you have any plans to create a ROS node using nano_llm, like you did with ros_deep_learning built on jetson-inference as the underlying library?

FYI, I have created a ROS node in which a USB camera on a robot running an Orin Nano transmits its video feed over RTP, and I access that RTP feed directly on an Orin NX running nano_llm. Running nano_llm on the NX stretches its compute capability, but it works.
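Roughly, my receive side on the NX looks like the sketch below. This is just an illustration of my setup, not exact code: the model name, port, and prompt are placeholders, and the robot side simply pushes /dev/video0 out over RTP with jetson_utils (e.g. `video-viewer /dev/video0 rtp://<orin-nx-ip>:1234`).

```python
#!/usr/bin/env python3
# Receive the robot's RTP feed on the Orin NX and caption frames with NanoLLM.
# Sketch only - model, port, and prompt below are placeholders for my setup.
from jetson_utils import videoSource
from nano_llm import NanoLLM, ChatHistory

stream = videoSource("rtp://@:1234")           # listen for the incoming RTP feed

model = NanoLLM.from_pretrained(
    "Efficient-Large-Model/VILA1.5-3b",        # any supported VLM
    api="mlc",
)
chat = ChatHistory(model)

while stream.IsStreaming():
    img = stream.Capture()
    if img is None:                            # capture timeout, keep waiting
        continue

    chat.append("user", image=img)
    chat.append("user", "What do you see? Keep it short.")
    embedding, _ = chat.embed_chat()

    reply = model.generate(embedding, kv_cache=chat.kv_cache, max_new_tokens=48)
    for token in reply:                        # print the streamed tokens
        print(token, end="", flush=True)
    print()

    chat.reset()                               # fresh context for the next frame
```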

I can envisage two scenarios:

1. A robot using an AGX Orin runs nano_llm onboard, showcasing computing at the edge.
2. A robot on a smaller compute platform like the Orin Nano (as I have) transmits the feed over RTP, which is then processed on a faster compute platform (AGX Orin).

Thanks

PS: For 2) above, being able to deploy the nano_llm ROS node on x86 would be great. I know you do not offer ros_deep_learning on x86, but as we discussed on the jetson-inference thread, I was able to build it on x86 myself; see https://github.com/dusty-nv/ros_deep_learning/issues/124

dusty-nv commented 2 months ago

Hi @Fibo27, yes, check this out (I need to update the NanoLLM docs): https://github.com/NVIDIA-AI-IOT/ros2_nanollm

There is the VLM node on there so far, but I mean to add the LLM one soon (basically just a copy & paste). More importantly, we have the container combining nano_llm with ros2:humble (dustynv/nano_llm:humble-36.3.0).

That container also includes jetson-inference and ros_deep_learning. If you'd like to contribute to ros2_nanollm, feel free to submit PRs 👍
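The node itself is essentially a thin rclpy wrapper around the NanoLLM chat API. Something along these lines, just a sketch rather than the exact code in ros2_nanollm (the topic names and model here are placeholders):

```python
#!/usr/bin/env python3
# Rough shape of a NanoLLM VLM node for ROS2 - example only, not the ros2_nanollm source.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from std_msgs.msg import String
from cv_bridge import CvBridge

from nano_llm import NanoLLM, ChatHistory


class NanoLLMNode(Node):
    def __init__(self):
        super().__init__('nano_llm_vlm')
        self.model = NanoLLM.from_pretrained('Efficient-Large-Model/VILA1.5-3b', api='mlc')
        self.chat = ChatHistory(self.model)
        self.bridge = CvBridge()

        # subscribe to the camera topic, publish the model's description
        self.sub = self.create_subscription(Image, 'image_raw', self.on_image, 1)
        self.pub = self.create_publisher(String, 'vlm/description', 10)

    def on_image(self, msg):
        frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding='rgb8')

        self.chat.append('user', image=frame)
        self.chat.append('user', 'Describe the scene in one sentence.')
        embedding, _ = self.chat.embed_chat()

        reply = self.model.generate(embedding, kv_cache=self.chat.kv_cache, max_new_tokens=48)
        text = ''.join(token for token in reply)   # collect the streamed tokens

        self.pub.publish(String(data=text))
        self.chat.reset()                          # one-shot query per frame


def main():
    rclpy.init()
    rclpy.spin(NanoLLMNode())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```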

Fibo27 commented 2 months ago

Great thanks