onnx / onnx-tensorrt

ONNX-TensorRT: TensorRT backend for ONNX
Apache License 2.0

Conversion fails for negative pad values #847

Open katrasnikj opened 2 years ago

katrasnikj commented 2 years ago

Description

If an ONNX model contains a Pad layer with negative pad values, the conversion to TensorRT fails with the following error:

[05/30/2022-09:26:47] [E] [TRT] ModelImporter.cpp:748: While parsing node number 0 [Pad -> "182"]:
[05/30/2022-09:26:47] [E] [TRT] ModelImporter.cpp:749: --- Begin node ---
[05/30/2022-09:26:47] [E] [TRT] ModelImporter.cpp:750: input: "input"
input: "179"
input: "181"
output: "182"
name: "Pad_23"
op_type: "Pad"
attribute {
  name: "mode"
  s: "constant"
  type: STRING
}

[05/30/2022-09:26:47] [E] [TRT] ModelImporter.cpp:751: --- End node ---
[05/30/2022-09:26:47] [E] [TRT] ModelImporter.cpp:754: ERROR: builtin_op_importers.cpp:3104 In function importPad:
[8] Assertion failed: convertOnnxPadding(ctx, nbDims, onnxPadding, start, totalPadding) && "Failed to convert padding!"
[05/30/2022-09:26:47] [E] Failed to parse onnx file
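
For reference, here is a minimal sketch of a model that hits the same code path: a constant-mode Pad whose pads initializer contains negative values. The shapes, names, and pad values below are illustrative and are not taken from the attached model.

```python
# Hypothetical repro: build a tiny ONNX model whose Pad node crops
# (negative pads) instead of padding. Shapes and names are made up.
import numpy as np
import onnx
from onnx import TensorProto, helper, numpy_helper

# Begin pads [0, 0, -1, -1] and end pads [0, 0, -1, -1] for an NCHW input:
# crop one element from each border of H and W.
pads = np.array([0, 0, -1, -1, 0, 0, -1, -1], dtype=np.int64)

graph = helper.make_graph(
    nodes=[helper.make_node("Pad", ["input", "pads"], ["output"], mode="constant")],
    name="negative_pad_repro",
    inputs=[helper.make_tensor_value_info("input", TensorProto.FLOAT, [1, 3, 8, 8])],
    outputs=[helper.make_tensor_value_info("output", TensorProto.FLOAT, [1, 3, 6, 6])],
    initializer=[numpy_helper.from_array(pads, name="pads")],
)
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 13)])
onnx.checker.check_model(model)
onnx.save(model, "neg_pad.onnx")
```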

Environment

TensorRT Version: 8.4.0
ONNX-TensorRT Version / Branch: /
GPU Type: Geforce RTX 3080
Nvidia Driver Version: 512.77
CUDA Version: 11.6
CUDNN Version: 8.3
Operating System + Version: Windows 10
Python Version (if applicable):
TensorFlow + TF2ONNX Version (if applicable):
PyTorch Version (if applicable): 1.10.1
Baremetal or Container (if container which image + tag):

Relevant Files

Steps To Reproduce

Unzip model.zip and run trtexec.exe --onnx=model.onnx
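
Alternatively, the parse failure can be surfaced from the TensorRT Python bindings instead of trtexec. A sketch assuming the TensorRT 8.x Python API and a model.onnx in the working directory:

```python
# Sketch: parse the model with the TensorRT ONNX parser and print the
# parser errors instead of going through trtexec.
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

with open("model.onnx", "rb") as f:
    ok = parser.parse(f.read())

if not ok:
    for i in range(parser.num_errors):
        print(parser.get_error(i))  # e.g. the importPad assertion shown above
```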

zerollzeng commented 2 years ago

Can you share the model here?

katrasnikj commented 2 years ago

Here is a zipped example model

model.zip

donrax commented 2 years ago

Note: negative pad conversion was working with TensorRT version 8.0.3.4.

zerollzeng commented 2 years ago

@kevinch-nv Do we not support negative pad values in the parser? We parse the Pad operator as an ISliceLayer, right?
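
For context, purely negative pads are cropping, which is exactly what a Slice / ISliceLayer computes. As an unofficial stopgap, a negative-pad Pad node could be rewritten into a Slice before the model reaches the parser. A rough sketch that only handles the constant-mode, all-non-positive case with pads stored as an initializer (names below are made up):

```python
# Rough workaround sketch (not an official fix): rewrite purely negative
# Pad nodes into equivalent Slice nodes so the parser never sees them.
import numpy as np
import onnx
from onnx import helper, numpy_helper

model = onnx.load("model.onnx")
graph = model.graph
inits = {init.name: init for init in graph.initializer}

for idx, node in enumerate(list(graph.node)):
    if node.op_type != "Pad" or len(node.input) < 2 or node.input[1] not in inits:
        continue
    pads = numpy_helper.to_array(inits[node.input[1]]).astype(np.int64)
    if pads.min() >= 0 or pads.max() > 0:
        continue  # only rewrite pure cropping (all pads <= 0, at least one < 0)
    rank = pads.size // 2
    begins, ends = pads[:rank], pads[rank:]
    starts = -begins                                            # crop from the front
    stops = np.where(ends < 0, ends, np.iinfo(np.int64).max)    # 0 means "until end"
    axes = np.arange(rank, dtype=np.int64)
    extra = []
    for suffix, vals in (("starts", starts), ("ends", stops), ("axes", axes)):
        name = node.name + "_" + suffix
        graph.initializer.append(numpy_helper.from_array(vals, name))
        extra.append(name)
    slice_node = helper.make_node(
        "Slice", [node.input[0]] + extra, list(node.output),
        name=node.name + "_as_slice")
    graph.node.insert(idx, slice_node)  # keep topological order
    graph.node.remove(node)

onnx.save(model, "model_no_negative_pad.onnx")
```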

donrax commented 2 years ago

Any updates on the issue?

donrax commented 2 years ago

The issue seemed to be resolved in TensorRT version 8.4.1.5 (and also 8.4.0.6). Correction: the issue is not resolved.

Markovvn1w commented 4 months ago

Any updates on the issue? The problem is still reproducible with ONNX 1.18.0 and TensorRT 10.0.