Closed ylab604 closed 3 months ago
I converted rapidflow_it12 to ONNX as described in the README and then converted that to TensorRT. ONNX inference is confirmed to work, but the TensorRT result comes out as only a constant.
rapidflow_it12.ckpt ==> ONNX (O)
ONNX ==> TensorRT (X)
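As a quick sanity check for this symptom, a small helper can flag when an engine's output has collapsed to a (near-)constant tensor. This is a minimal pure-Python sketch; the function name and tolerance are illustrative and not from the rapidflow repository.

```python
def is_constant_output(values, tol=1e-6):
    """Return True if every value lies within `tol` of the others,
    i.e. the output has collapsed to a single constant."""
    flat = list(values)
    if not flat:
        return True
    return max(flat) - min(flat) <= tol

# A healthy flow field varies across pixels; a broken TensorRT
# engine often emits one repeated value everywhere.
print(is_constant_output([0.5, 0.5, 0.5]))   # constant output
print(is_constant_output([0.1, -0.3, 0.7]))  # varying output
```

Running this on a flattened slice of the TensorRT output (versus the ONNX output on the same input) makes the failure easy to confirm programmatically rather than by eyeballing arrays.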
I have a question: is your TensorRT version 8.6?
Hello, thank you for your report.
I pushed a new update with some modifications to the rapidflow code and a simple script to compile to TensorRT.
Please check rapidflow's readme for some instructions. I tested it on my computer, not on a Jetson. It seems to work on the sample images, but I did not test it very extensively. If there is still some problem, please let me know.
The versions in my test were: torch 2.2.2+cu121, torch_tensorrt 2.2.0, tensorrt 8.6.1.
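Since issues like this often come down to version mismatches, a tiny check against the combination reported above can save debugging time before recompiling. A sketch using only the standard library; the helper names are illustrative, and the tested-version table is taken from this thread:

```python
# Versions the maintainer reported testing (from this thread).
TESTED = {"torch": "2.2.2", "torch_tensorrt": "2.2.0", "tensorrt": "8.6.1"}

def version_tuple(v):
    """'2.2.2+cu121' -> (2, 2, 2); drops local suffixes like '+cu121'."""
    return tuple(int(p) for p in v.split("+")[0].split("."))

def matches_tested(name, installed):
    """True if the installed major.minor matches the tested major.minor."""
    return version_tuple(installed)[:2] == version_tuple(TESTED[name])[:2]

print(matches_tested("tensorrt", "8.6.2"))   # same 8.6 line as tested
print(matches_tested("tensorrt", "10.0.1"))  # different major version
```

Comparing only major.minor is a deliberate choice here: patch releases (8.6.1 vs 8.6.2, as in this thread) are usually compatible, while a major jump in TensorRT is a common source of broken engines.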
Best regards
Thank you for your kindness. I will check and report the result soon.
It works on an Orin NX with JetPack 6.0, ONNX opset 17, and TensorRT 8.6.2. Thank you a lot!! Best regards
I'll thank you in person when I see you at the next conference. Let's have a meal then.
Glad to know it is working! Yes, looking forward to our next meeting.
Hello! Thank you for your great work on rapidflow (ICRA 2024). I remember your explanation at the poster session, and I am writing this while testing rapidflow. I am currently converting rapidflow to ONNX, then to TensorRT, and experimenting on an Orin NX. However, the conversion to TensorRT does not seem to be robust: the output is only a numpy array of constant values. Do you know anything about this? How did you convert to TensorRT in your experiments?
Once again, thank you so much for publishing such good research.