NVIDIA / TensorRT

NVIDIA® TensorRT™ is an SDK for high-performance deep learning inference on NVIDIA GPUs. This repository contains the open source components of TensorRT.
https://developer.nvidia.com/tensorrt
Apache License 2.0

TensorRT cannot support Add, creator: Plugin not found, are the plugin name, version, and namespace correct? #3935

Open demuxin opened 3 weeks ago

demuxin commented 3 weeks ago

Description

I tried to convert my ONNX model into a TensorRT engine, and I got this error:

[06/11/2024-07:46:17] [E] [TRT] ModelImporter.cpp:882: While parsing node number 23810 [Add -> "boxes"]:
[06/11/2024-07:46:17] [E] [TRT] ModelImporter.cpp:883: --- Begin node ---
input: "/bbox_embed.5/layers.2/Add_output_0"
input: "/Log_output_0"
output: "boxes"
name: "/Add"
op_type: "Add"
doc_string: "/workspace/deta_export/models/backbone.py(241): forward\n/home/osmagic/.local/lib/python3.9/site-packages/torch/nn/modules/module.py(1501): _slow_forward\n/home/osmagic/.local/lib/python3.9/site-packages/torch/nn/modules/module.py(1520): _call_impl\n/home/osmagic/.local/lib/python3.9/site-packages/torch/nn/modules/module.py(1511): _wrapped_call_impl\n/home/osmagic/.local/lib/python3.9/site-packages/torch/jit/_trace.py(129): wrapper\n/home/osmagic/.local/lib/python3.9/site-packages/torch/jit/_trace.py(138): forward\n/home/osmagic/.local/lib/python3.9/site-packages/torch/nn/modules/module.py(1520): _call_impl\n/home/osmagic/.local/lib/python3.9/site-packages/torch/nn/modules/module.py(1511): _wrapped_call_impl\n/home/osmagic/.local/lib/python3.9/site-packages/torch/jit/_trace.py(1296): _get_trace_graph\n/home/osmagic/.local/lib/python3.9/site-packages/torch/onnx/utils.py(918): _trace_and_get_graph_from_model\n/home/osmagic/.local/lib/python3.9/site-packages/torch/onnx/utils.py(1014): _create_jit_graph\n/home/osmagic/.local/lib/python3.9/site-packages/torch/onnx/utils.py(1139): _model_to_graph\n/home/osmagic/.local/lib/python3.9/site-packages/torch/onnx/utils.py(1618): _export\n/home/osmagic/.local/lib/python3.9/site-packages/torch/onnx/utils.py(516): export\n/workspace/deta_export/test.py(31): save_onnx\n/workspace/deta_export/test.py(116): <module>\n"

[06/11/2024-07:46:17] [E] [TRT] ModelImporter.cpp:884: --- End node ---
[06/11/2024-07:46:17] [E] [TRT] ModelImporter.cpp:887: ERROR: builtin_op_importers.cpp:5883 In function importFallbackPluginImporter:
[8] Assertion failed: creator: Plugin not found, are the plugin name, version, and namespace correct?
[06/11/2024-07:46:17] [E] Failed to parse onnx file
[06/11/2024-07:46:18] [I] Finished parsing network model. Parse time: 20.6617
[06/11/2024-07:46:18] [E] Parsing model failed
[06/11/2024-07:46:18] [E] Failed to create engine from model or file.
[06/11/2024-07:46:18] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec [TensorRT v9300] # /usr/local/tensorrt/TensorRT-9.3.0.1/targets/x86_64-linux-gnu/bin/trtexec --minShapes=images:1x3x832x1440 --optShapes=images:1x3x832x1440 --maxShapes=images:1x3x832x1440 --onnx=weights/out_nms_0611.onnx --saveEngine=weights/swin.trt --fp16 --verbose

This is the location of the code that prompts the error report: https://github.com/jozhang97/DETA/blob/f47e50efeb194357fb93a119f196a2e485fffdb5/models/deformable_detr.py#L172

Add should be a basic operator in TensorRT, so it's very strange that the parser cannot find it.

I can provide the ONNX file I am using.

This is my conversion command:

/usr/local/tensorrt/TensorRT-9.3.0.1/targets/x86_64-linux-gnu/bin/trtexec \
--minShapes=images:1x3x832x1440 --optShapes=images:1x3x832x1440 --maxShapes=images:1x3x832x1440 \
--onnx=weights/out_nms_0611.onnx --saveEngine=weights/swin.trt --fp16 --verbose

Environment

TensorRT Version: 9.3.0.1

NVIDIA GPU: GeForce RTX 3090

NVIDIA Driver Version: 535.161.08

CUDA Version: 12.2

Operating System: Ubuntu 22.04

Python Version (if applicable): 3.9.7

PyTorch Version (if applicable): 2.2.1

lix19937 commented 2 weeks ago

Can you upload the related ONNX model?