hamdiboukamcha / Yolo-V10-cpp-TensorRT

YOLOv10 inference implemented in C++ and optimized with NVIDIA TensorRT
GNU General Public License v3.0

Do you use the ONNX file from https://github.com/THU-MIG/yolov10? #1

Open rlewkowicz opened 1 month ago

rlewkowicz commented 1 month ago

I've run through a handful of C++ implementations, including yours. I have the YOLOv10 model trained and tested, and I think my problem is with the ONNX export from that toolchain.

How do you generate your ONNX file?

hamdiboukamcha commented 1 month ago

Yes, that could be the issue. To generate a compatible ONNX file, you can use the following command:

yolo export model=yolov10s.pt format=onnx opset=13 simplify

This exports the ONNX file with the expected settings (opset 13, simplified graph). Let me know if you need any further assistance!
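For reference, once the ONNX file is exported it can be serialized into a TensorRT engine either with trtexec or programmatically. The sketch below is not this repository's exact code; it is a minimal example assuming TensorRT 8.x with the ONNX parser, linked against nvinfer and nvonnxparser, and a hypothetical yolov10s.onnx in the working directory:

```cpp
#include <NvInfer.h>
#include <NvOnnxParser.h>

#include <fstream>
#include <iostream>
#include <memory>

// Minimal logger required by the TensorRT builder.
class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) noexcept override {
        if (severity <= Severity::kWARNING) std::cout << msg << "\n";
    }
};

int main() {
    Logger logger;
    auto builder = std::unique_ptr<nvinfer1::IBuilder>(nvinfer1::createInferBuilder(logger));

    // Explicit-batch network (required by the ONNX parser on TensorRT 8.x).
    const auto flags =
        1U << static_cast<uint32_t>(nvinfer1::NetworkDefinitionCreationFlag::kEXPLICIT_BATCH);
    auto network = std::unique_ptr<nvinfer1::INetworkDefinition>(builder->createNetworkV2(flags));

    // Parse the exported ONNX graph into the TensorRT network.
    auto parser = std::unique_ptr<nvonnxparser::IParser>(nvonnxparser::createParser(*network, logger));
    if (!parser->parseFromFile("yolov10s.onnx",
                               static_cast<int>(nvinfer1::ILogger::Severity::kWARNING))) {
        std::cerr << "Failed to parse yolov10s.onnx\n";
        return 1;
    }

    auto config = std::unique_ptr<nvinfer1::IBuilderConfig>(builder->createBuilderConfig());
    // FP16 is optional; drop this line if you want to rule it out as an accuracy factor.
    config->setFlag(nvinfer1::BuilderFlag::kFP16);

    // Build and serialize the engine, then write it to disk.
    auto serialized = std::unique_ptr<nvinfer1::IHostMemory>(
        builder->buildSerializedNetwork(*network, *config));
    if (!serialized) {
        std::cerr << "Engine build failed\n";
        return 1;
    }
    std::ofstream out("yolov10s.engine", std::ios::binary);
    out.write(static_cast<const char*>(serialized->data()), serialized->size());
    return 0;
}
```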

rlewkowicz commented 1 month ago

I've tried the THU-MIG repo and the native YOLO toolchain. The native export gives me an error about a magic tag when loading the engine; I can still build the engine, but the accuracy is beyond poor. The THU-MIG repo's export works, but the accuracy is also bad.
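As an aside, the "magic tag" message on deserialization usually means the serialized engine and the runtime come from different TensorRT builds (or that the file being loaded is not a serialized engine at all). A quick sanity check, sketched here rather than taken from the repo, is to compare the header version the loader was compiled against with the nvinfer library actually loaded at run time:

```cpp
#include <NvInfer.h>
#include <cstdio>

int main() {
    // NV_TENSORRT_VERSION comes from the headers this binary was compiled against;
    // getInferLibVersion() reports the nvinfer library loaded at run time.
    // Engines can only be deserialized by the same TensorRT version that built them,
    // so a mismatch here (or between the machine that built the .engine and the one
    // loading it) is a common cause of the "magic tag" error.
    std::printf("Built against TensorRT %d, running with TensorRT %d\n",
                static_cast<int>(NV_TENSORRT_VERSION),
                static_cast<int>(getInferLibVersion()));
    return 0;
}
```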

If you use Polygraphy and run:

polygraphy run .\best.onnx --trt --onnxrt

Do you get errors about output differences exceeding the comparison tolerance?
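(For what it's worth, Polygraphy also lets you set the comparison tolerances explicitly, e.g. with --atol/--rtol; the exact flag names are worth confirming against the installed Polygraphy version:)

polygraphy run .\best.onnx --trt --onnxrt --atol 1e-3 --rtol 1e-3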