I can convert the two provided exported ONNX models to TensorRT and run them.
However, when I try to export from the original repository, I have no luck.
I exported resnet50 and shufflenetv216 to ONNX, and both ONNX models work well with onnxruntime.
Later I converted them to TensorRT with onnx2trt, but the following error occurs when running the converted TRT engine:
[TensorRT] ERROR: ../rtExt/cuda/pointwiseV2Helpers.h (538) - Cuda Error in launchPwgenKernel: 400 (invalid resource handle)
[TensorRT] ERROR: FAILED_EXECUTION: std::exception
Does anyone have any ideas? Thank you very much.