Open RoboEvangelist opened 4 months ago
Hi @RoboEvangelist, the C++ version has not been developed yet, but it is on the TODO list. For now, you can export an ONNX model and adopt TensorRT or other acceleration tools.
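A minimal sketch of the usual export-then-accelerate path: once you have an ONNX file, the `trtexec` tool that ships with TensorRT can build a serialized engine from it (the model and engine file names here are illustrative):

```shell
# Convert an exported ONNX model to a serialized TensorRT engine.
# --fp16 enables half-precision kernels where the GPU supports them (optional).
trtexec --onnx=yolo_world.onnx --saveEngine=yolo_world.engine --fp16
```

The resulting `.engine` file can then be loaded by a C++ or Python TensorRT runtime for inference.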
Do we have a TensorRT demo?
@jzx-gooner https://github.com/PrinceP/tensorrt-cpp-for-onnx?tab=readme-ov-file#yolo-world
Thanks, bro. This is great!
@PrinceP What are the processing speed and memory usage compared with the Python version?
Thanks
@RoboEvangelist For the ONNX model: average time per image: 0.3813 seconds
For the TensorRT model: average time per image: 0.0804 seconds
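Those figures read as per-image averages over a batch of images, which puts the TensorRT engine at roughly a 4.7x speedup. A minimal, framework-agnostic sketch of how such a number is typically measured (the helper name and warm-up count are illustrative, not taken from PrinceP's repo):

```python
import time

def average_time_per_image(run_inference, images, warmup=3):
    # Warm-up iterations exclude one-time costs (engine load, CUDA context init)
    for img in images[:warmup]:
        run_inference(img)
    start = time.perf_counter()
    for img in images:
        run_inference(img)
    # Wall-clock total divided by image count gives the per-image average
    return (time.perf_counter() - start) / len(images)

# Example with a dummy inference function standing in for the real model
avg = average_time_per_image(lambda img: img, list(range(10)))
print(f"Average time per image: {avg:.4f} seconds")
```

Warming up before timing matters especially for TensorRT, where the first calls can include deserialization and kernel selection overhead that would skew a small-batch average.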
Where can I get a high-performance version of the code, such as a PyTorch C++ version?
Thanks