Closed webrtcccccc closed 5 years ago
rm TensorRT/model/MobileNetSSD_deploy.caffemodel.1.tensorcache
Thank you for your reply; "rm TensorRT/model/MobileNetSSD_deploy.caffemodel.1.tensorcache" works perfectly.
I have been able to run it successfully on the Jetson Nano.
Can you please share what speed (fps) you are getting?
About 10~20fps.
Thank you.
The build succeeds, but it fails at runtime with the output below.
attempting to open cache file /home/ubuntuuser/MobileNet-SSD-TensorRT/model/MobileNetSSD_deploy.caffemodel.1.tensorcache
loading network profile from cache...
createInference
The engine plan file is incompatible with this version of TensorRT, expecting 5.0.6.3 got -2100382470.0.2.1852793695, please rebuild.
createInference_end
Bindings after deserializing:
Segmentation fault (core dumped)
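For what it's worth, this error usually means the serialized engine cache was built by a different TensorRT version than the one now installed, so deserialization fails. A minimal sketch of the fix suggested earlier in the thread (the cache path is taken from the error log above; adjust it to your own checkout):

```shell
# A serialized TensorRT engine (.tensorcache) is tied to the exact
# TensorRT version that built it. Removing the stale cache forces the
# program to rebuild the engine on the next run.
CACHE=/home/ubuntuuser/MobileNet-SSD-TensorRT/model/MobileNetSSD_deploy.caffemodel.1.tensorcache

# Delete the cache only if it exists, then rerun the demo.
[ -f "$CACHE" ] && rm "$CACHE"
```

The first run after deleting the cache will be slow while the engine is rebuilt; subsequent runs reuse the fresh cache.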