milely opened this issue 1 year ago
Your log is incomplete; the later part of the error log should tell you why the ONNX parse failed.
> Your log is incomplete; the later part of the error log should tell you why the ONNX parse failed.

Here is the complete log. It seems to say that the plugin for GridSample was not found. The official documentation says TensorRT 8.6 supports the GridSample operation, but the conversion still fails. Can you help me figure out what the problem might be?
```
[10/25/2023-21:30:50] [W] [TRT] onnx2trt_utils.cpp:369: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[10/25/2023-21:30:50] [W] [TRT] onnx2trt_utils.cpp:395: One or more weights outside the range of INT32 was clamped
[10/25/2023-21:30:50] [I] [TRT] No importer registered for op: GridSample. Attempting to import as plugin.
[10/25/2023-21:30:50] [I] [TRT] Searching for plugin: GridSample, plugin_version: 1, plugin_namespace:
[10/25/2023-21:30:50] [E] [TRT] ModelImporter.cpp:773: While parsing node number 114 [GridSample -> "/face_warp/GridSample_output_0"]:
[10/25/2023-21:30:50] [E] [TRT] ModelImporter.cpp:774: --- Begin node ---
[10/25/2023-21:30:50] [E] [TRT] ModelImporter.cpp:775: input: "/Transpose_1_output_0" input: "/face_warp/Concat_3_output_0" output: "/face_warp/GridSample_output_0" name: "/face_warp/GridSample" op_type: "GridSample" attribute { name: "align_corners" i: 0 type: INT } attribute { name: "mode" s: "bilinear" type: STRING } attribute { name: "padding_mode" s: "zeros" type: STRING }
[10/25/2023-21:30:50] [E] [TRT] ModelImporter.cpp:776: --- End node ---
[10/25/2023-21:30:50] [E] [TRT] ModelImporter.cpp:779: ERROR: builtin_op_importers.cpp:4890 In function importFallbackPluginImporter: [8] Assertion failed: creator && "Plugin not found, are the plugin name, version, and namespace correct?"
[10/25/2023-21:30:50] [E] Failed to parse onnx file
[10/25/2023-21:30:50] [I] Finished parsing network model. Parse time: 0.393404
[10/25/2023-21:30:50] [E] Parsing model failed
[10/25/2023-21:30:50] [E] Failed to create engine from model or file.
[10/25/2023-21:30:50] [E] Engine set up failed
```
Does the model work with onnxruntime? GridSample should be a supported op; see https://github.com/onnx/onnx-tensorrt/blob/8.6-GA/docs/operators.md
Please provide a repro if it works with onnxruntime.
I guess you are using a model from MMLab. Try converting with `--staticPlugins=xx/libmmdeploy_tensorrt_ops.so`.
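For anyone hitting the same error: the `--staticPlugins` flag tells `trtexec` to load a plugin library at parse time, so the GridSample plugin creator can be found. A sketch of the full command (the model and library paths below are placeholders; point them at your own files):

```shell
# Build a TensorRT engine with the MMDeploy custom-op library loaded,
# so ops like GridSample resolve to the plugin implementations.
trtexec --onnx=model.onnx \
        --staticPlugins=/path/to/libmmdeploy_tensorrt_ops.so \
        --saveEngine=model.engine
```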
I did not use a model from MMLab; instead, I replaced the grid_sample operator with the bilinear_grid_sample operator implemented in mmcv, which solved the problem.
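For context, this workaround succeeds because mmcv's bilinear_grid_sample expresses grid sampling with plain elementwise and gather operations, so the exported ONNX graph contains only ops the TensorRT parser already imports, rather than a single GridSample node. A minimal NumPy sketch of the same idea (illustrative only, not mmcv's actual code; it mirrors the failing node's `mode="bilinear"`, `padding_mode="zeros"` semantics):

```python
import numpy as np

def bilinear_grid_sample(im, grid, align_corners=False):
    """Bilinear grid sampling built from plain array ops.

    im:   (N, C, H, W) input feature map
    grid: (N, Hg, Wg, 2) sampling coords in [-1, 1], (x, y) order
    Out-of-range samples read as zero (padding_mode="zeros").
    """
    n, c, h, w = im.shape
    gx, gy = grid[..., 0], grid[..., 1]
    # Map normalized [-1, 1] coordinates to pixel coordinates.
    if align_corners:
        x = (gx + 1) / 2 * (w - 1)
        y = (gy + 1) / 2 * (h - 1)
    else:
        x = ((gx + 1) * w - 1) / 2
        y = ((gy + 1) * h - 1) / 2
    x0, y0 = np.floor(x).astype(int), np.floor(y).astype(int)
    x1, y1 = x0 + 1, y0 + 1
    # Bilinear weights for the four surrounding pixels.
    wa = (x1 - x) * (y1 - y)  # weight for (x0, y0)
    wb = (x1 - x) * (y - y0)  # weight for (x0, y1)
    wc = (x - x0) * (y1 - y)  # weight for (x1, y0)
    wd = (x - x0) * (y - y0)  # weight for (x1, y1)

    def gather(xs, ys):
        # Zero out-of-range reads; clamp indices so gathering stays safe.
        valid = (xs >= 0) & (xs < w) & (ys >= 0) & (ys < h)
        xs_c, ys_c = np.clip(xs, 0, w - 1), np.clip(ys, 0, h - 1)
        out = np.empty((n, c) + xs.shape[1:], dtype=im.dtype)
        for i in range(n):
            out[i] = im[i][:, ys_c[i], xs_c[i]] * valid[i]
        return out

    return (gather(x0, y0) * wa[:, None] + gather(x0, y1) * wb[:, None]
            + gather(x1, y0) * wc[:, None] + gather(x1, y1) * wd[:, None])
```

Because every step here is a basic tensor op, a PyTorch version of the same function traces to Mul/Add/Gather nodes during ONNX export instead of GridSample.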
> I guess you are using a model from MMLab. Try converting with `--staticPlugins=xx/libmmdeploy_tensorrt_ops.so`.
Hi, I use mmdeploy to convert the model, but there is no such parameter:

```
usage: deploy.py [-h] [--test-img TEST_IMG [TEST_IMG ...]] [--work-dir WORK_DIR]
                 [--calib-dataset-cfg CALIB_DATASET_CFG] [--device DEVICE]
                 [--log-level {CRITICAL,FATAL,ERROR,WARN,WARNING,INFO,DEBUG,NOTSET}]
                 [--show] [--dump-info] [--quant-image-dir QUANT_IMAGE_DIR]
                 [--quant] [--uri URI]
                 deploy_cfg model_cfg checkpoint img
deploy.py: error: unrecognized arguments: --static-plugins=/home/admin/miniconda3/lib/python3.10/site-packages/mmdeploy/lib/libmmdeploy_tensorrt_ops.so
```
## Description
When I tried to convert the ONNX model to a TRT model, I was using TensorRT 8.6 with ONNX opset 17. The grid sample operation in the model caused an error during conversion.

![image](https://github.com/NVIDIA/TensorRT/assets/19663249/c818b7ff-725b-47de-a8a8-78bde4df7214)
![image](https://github.com/NVIDIA/TensorRT/assets/19663249/50f29a4f-47b9-4adc-97b0-66512682c9a3)
![image](https://github.com/NVIDIA/TensorRT/assets/19663249/8d1a0c07-6a07-41b1-90ee-975d630c7560)

Can anyone help answer this question?

## Environment

TensorRT Version: 8.6.1
NVIDIA GPU: 3090
NVIDIA Driver Version:
CUDA Version: 11.1
CUDNN Version:
Operating System:
Python Version (if applicable):
Tensorflow Version (if applicable):
PyTorch Version (if applicable):
Baremetal or Container (if so, version):
## Relevant Files
Model link:
## Steps To Reproduce
Commands or scripts:
Have you tried the latest release?:
Can this model run on other frameworks? For example run ONNX model with ONNXRuntime (`polygraphy run <model.onnx> --onnxrt`):