dusty-nv / jetson-inference

Hello AI World guide to deploying deep-learning inference networks and deep vision primitives with TensorRT and NVIDIA Jetson.
https://developer.nvidia.com/embedded/twodaystoademo
MIT License

Modifications to detectNet.cpp for use with Yolo NAS or Yolov8 #1797

Open lars-nagel opened 7 months ago

lars-nagel commented 7 months ago

Since the newer YOLO architectures are not officially supported for object detection, I wonder if anyone from the community has successfully modified the pre- and post-processing in detectNet.cpp to use it with YOLO-NAS or YOLOv8, and is willing to share how they did it. I would love to use those models with jetson-inference, which is a great framework. Any help is appreciated!

pq53ui commented 6 months ago

I put some time into it, and I think I was able to modify the ONNX models so that their outputs are compatible with detectNet. But since I am using the Jetson Nano Developer Kit (not an Orin), the YOLO models just can't run due to unsupported operations inside the model.

If that's not your issue and you just need to transform the model, this post might help: https://vilsonrodrigues.medium.com/add-non-maximum-suppression-nms-to-object-detection-model-using-onnx-7639a698cf05
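Neither comment shows the actual changes, but as a rough sketch of what the post-processing side involves: a raw YOLOv8 detection head exports with shape (1, 4 + num_classes, num_anchors) (e.g. (1, 84, 8400) for COCO), where each anchor carries 4 box values (cx, cy, w, h) and per-class scores with no separate objectness term. Whether you port this into detectNet.cpp or bake NMS into the ONNX graph as the linked post does, the decode logic is the same. The snippet below is an illustrative NumPy version (the function names `decode_yolov8` and `nms` are mine, not from jetson-inference or Ultralytics):

```python
import numpy as np

def nms(boxes, scores, iou_thresh=0.45):
    """Greedy non-maximum suppression. boxes: (N, 4) in x1,y1,x2,y2; returns kept indices."""
    order = scores.argsort()[::-1]           # highest score first
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(i)
        if order.size == 1:
            break
        # intersection of the top box with the remaining boxes
        xx1 = np.maximum(boxes[i, 0], boxes[order[1:], 0])
        yy1 = np.maximum(boxes[i, 1], boxes[order[1:], 1])
        xx2 = np.minimum(boxes[i, 2], boxes[order[1:], 2])
        yy2 = np.minimum(boxes[i, 3], boxes[order[1:], 3])
        inter = np.clip(xx2 - xx1, 0, None) * np.clip(yy2 - yy1, 0, None)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        areas = (boxes[order[1:], 2] - boxes[order[1:], 0]) * \
                (boxes[order[1:], 3] - boxes[order[1:], 1])
        iou = inter / (area_i + areas - inter)
        order = order[1:][iou <= iou_thresh]  # drop boxes overlapping the kept one
    return keep

def decode_yolov8(raw, conf_thresh=0.25, iou_thresh=0.45):
    """Decode a raw YOLOv8 output tensor of shape (1, 4 + num_classes, num_anchors)."""
    preds = raw[0].T                          # (num_anchors, 4 + num_classes)
    boxes_cxcywh = preds[:, :4]
    class_scores = preds[:, 4:]               # no objectness column in YOLOv8
    scores = class_scores.max(axis=1)
    classes = class_scores.argmax(axis=1)
    mask = scores >= conf_thresh              # confidence filter
    boxes_cxcywh, scores, classes = boxes_cxcywh[mask], scores[mask], classes[mask]
    # convert cx,cy,w,h -> x1,y1,x2,y2
    boxes = np.empty_like(boxes_cxcywh)
    boxes[:, 0] = boxes_cxcywh[:, 0] - boxes_cxcywh[:, 2] / 2
    boxes[:, 1] = boxes_cxcywh[:, 1] - boxes_cxcywh[:, 3] / 2
    boxes[:, 2] = boxes_cxcywh[:, 0] + boxes_cxcywh[:, 2] / 2
    boxes[:, 3] = boxes_cxcywh[:, 1] + boxes_cxcywh[:, 3] / 2
    keep = nms(boxes, scores, iou_thresh)
    return boxes[keep], scores[keep], classes[keep]
```

The corresponding change in detectNet.cpp would be to replace its existing output parsing with this decode step (per-anchor class max instead of objectness × class score), since detectNet's built-in post-processing expects the older SSD/DetectNet output layouts.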