Open lgy2830 opened 1 year ago
After setting up the environment, I exported the model to ONNX, and then got the error above while exporting the ONNX model to a TensorRT engine.
You can refer to this doc: https://github.com/PaddlePaddle/PaddleDetection/tree/develop/deploy/end2end_ppyoloe — you need to manually add the NMS operator when exporting.
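The export script in that doc attaches an `EfficientNMS_TRT` node to the ONNX graph. As a rough sketch of what such a node carries (attribute names follow the TensorRT plugin; the numeric values below are illustrative placeholders, not the defaults used by PaddleDetection's export script):

```python
# Illustrative attribute set for an EfficientNMS_TRT node. Attribute names
# follow the TensorRT plugin; the values are placeholders, not the defaults
# used by PaddleDetection's onnx_custom.py.
efficient_nms_attrs = {
    "background_class": -1,    # -1: no dedicated background class
    "max_output_boxes": 100,   # detections kept per image
    "score_threshold": 0.25,   # drop boxes below this confidence
    "iou_threshold": 0.45,     # NMS overlap threshold
    "score_activation": 0,     # 0: scores are already probabilities
    "box_coding": 0,           # 0: boxes encoded as corner coordinates
    "plugin_version": "1",     # must match the version trtexec looks up
}

for name, value in efficient_nms_attrs.items():
    print(f"{name} = {value}")
```

Note that `plugin_version: 1` is exactly what the trtexec log later searches for, so the node in the graph and the plugin registry have to agree on it.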
My steps followed the README:

1. Export the model with NMS excluded:

```
python tools/export_model.py -c configs/ppyoloe/ppyoloe_crn_l_300e_coco.yml -o weights=https://paddledet.bj.bcebos.com/models/ppyoloe_crn_l_300e_coco.pdparams trt=True exclude_nms=True
```

2. Insert the custom NMS op into the ONNX model:

```
python deploy/third_engine/demo_onnx_trt/onnx_custom.py --onnx_file=output_inference/ppyoloe_crn_l_300e_coco/ppyoloe_crn_l_300e_coco.onnx --model_dir=output_inference/ppyoloe_crn_l_300e_coco/ --opset_version=11
```

3. Build the TensorRT engine:

```
trtexec --onnx=output_inference/ppyoloe_crn_l_300e_coco/ppyoloe_crn_l_300e_coco.onnx --saveEngine=ppyoloe_crn_l_300e_coco.engine
```

Step 3 fails with:

```
[11/23/2022-16:16:28] [W] [TRT] onnx2trt_utils.cpp:220: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[11/23/2022-16:16:28] [I] [TRT] ModelImporter.cpp:135: No importer registered for op: EfficientNMS_TRT. Attempting to import as plugin.
[11/23/2022-16:16:28] [I] [TRT] builtin_op_importers.cpp:3771: Searching for plugin: EfficientNMS_TRT, plugin_version: 1, plugin_namespace:
[11/23/2022-16:16:28] [E] [TRT] INVALID_ARGUMENT: getPluginCreator could not find plugin EfficientNMS_TRT version 1
ERROR: builtin_op_importers.cpp:3773 In function importFallbackPluginImporter:
[8] Assertion failed: creator && "Plugin not found, are the plugin name, version, and namespace correct?"
[11/23/2022-16:16:28] [E] Failed to parse onnx file
[11/23/2022-16:16:28] [E] Parsing model failed
[11/23/2022-16:16:28] [E] Engine creation failed
[11/23/2022-16:16:28] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec # trtexec --onnx=output_inference/ppyoloe_crn_l_300e_coco/ppyoloe_crn_l_300e_coco.onnx --saveEngine=ppyoloe_crn_l_300e_coco.engine
```
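The decisive line in that log is the `getPluginCreator could not find plugin EfficientNMS_TRT` error; the INT64 message above it is only a warning. A small pattern-matching helper (hypothetical, not part of any toolkit) can tell the two apart:

```python
def diagnose_trtexec_log(log: str) -> str:
    """Heuristic triage of a trtexec log (hypothetical helper)."""
    if "getPluginCreator could not find plugin EfficientNMS_TRT" in log:
        # The plugin registry has no EfficientNMS_TRT: TensorRT is older
        # than 8.0, or the plugin library was never loaded.
        return "missing EfficientNMS_TRT plugin (needs TensorRT >= 8.0)"
    if "does not natively support INT64" in log:
        # Only a warning: INT64 weights are cast down to INT32.
        return "INT64 cast warning (harmless)"
    return "unknown failure"

line = ("[E] [TRT] INVALID_ARGUMENT: getPluginCreator could not find "
        "plugin EfficientNMS_TRT version 1")
print(diagnose_trtexec_log(line))  # -> missing EfficientNMS_TRT plugin (needs TensorRT >= 8.0)
```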
This is a TensorRT version problem: the EfficientNMS_TRT plugin requires TensorRT 8.0 or later.
Search before asking
Please ask your question
Running

```
trtexec --onnx=output_inference/ppyoloe_crn_l_300e_coco/ppyoloe_crn_l_300e_coco.onnx --saveEngine=ppyoloe_crn_l_300e_coco.engine
```

fails with:

```
[11/23/2022-16:16:28] [W] [TRT] onnx2trt_utils.cpp:220: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[11/23/2022-16:16:28] [I] [TRT] ModelImporter.cpp:135: No importer registered for op: EfficientNMS_TRT. Attempting to import as plugin.
[11/23/2022-16:16:28] [I] [TRT] builtin_op_importers.cpp:3771: Searching for plugin: EfficientNMS_TRT, plugin_version: 1, plugin_namespace:
[11/23/2022-16:16:28] [E] [TRT] INVALID_ARGUMENT: getPluginCreator could not find plugin EfficientNMS_TRT version 1
ERROR: builtin_op_importers.cpp:3773 In function importFallbackPluginImporter:
[8] Assertion failed: creator && "Plugin not found, are the plugin name, version, and namespace correct?"
[11/23/2022-16:16:28] [E] Failed to parse onnx file
[11/23/2022-16:16:28] [E] Parsing model failed
[11/23/2022-16:16:28] [E] Engine creation failed
[11/23/2022-16:16:28] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec # trtexec --onnx=output_inference/ppyoloe_crn_l_300e_coco/ppyoloe_crn_l_300e_coco.onnx --saveEngine=ppyoloe_crn_l_300e_coco.engine
```