NVIDIA / TensorRT

NVIDIA® TensorRT™ is an SDK for high-performance deep learning inference on NVIDIA GPUs. This repository contains the open source components of TensorRT.
https://developer.nvidia.com/tensorrt
Apache License 2.0

No acceleration for the CRNN model? #446

Closed Hubert2102 closed 3 years ago

Hubert2102 commented 4 years ago

Description

I trained an image-based sequence recognition model called CRNN in PyTorch. The network contains some conv and BiLSTM modules.
I exported my model to ONNX and TRT. For an input of shape (1, 1, 32, 320), inference takes 6 ms with TRT, while PyTorch inference takes 5 ms.

Environment

TensorRT Version: TensorRT-7.0.0.11
GPU Type: Tesla P40
Nvidia Driver Version: 440.33.01
CUDA Version: 10.2
CUDNN Version: 7.6.5
Operating System + Version: CentOS
Python Version (if applicable): 3.7.4
PyTorch Version (if applicable): 1.2

ZimingLu commented 4 years ago

Same problem here. I profiled the CNN and RNN parts separately and found the RNN is 3 times faster with TensorRT, but the CNN is too slow!
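
For anyone who wants to reproduce that per-layer breakdown, here is a minimal sketch of per-layer profiling with the TensorRT Python API. The engine file name, input shape, and FP32 buffer sizing are assumptions, not taken from the scripts in this thread.

import pycuda.autoinit  # noqa: F401  (creates a CUDA context)
import pycuda.driver as cuda
import tensorrt as trt

# Accumulates per-layer times so the conv part can be compared against the BiLSTM part.
class LayerProfiler(trt.IProfiler):
    def __init__(self):
        super().__init__()
        self.timings = {}

    def report_layer_time(self, layer_name, ms):
        self.timings[layer_name] = self.timings.get(layer_name, 0.0) + ms

logger = trt.Logger(trt.Logger.WARNING)
with open("crnn.engine", "rb") as f, trt.Runtime(logger) as runtime:  # placeholder path
    engine = runtime.deserialize_cuda_engine(f.read())

context = engine.create_execution_context()
context.profiler = LayerProfiler()                # reports after each synchronous execution
context.set_binding_shape(0, (1, 1, 32, 320))     # resolve the dynamic width axis

# One device buffer per binding (assumes FP32 tensors, 4 bytes per element).
bindings = [cuda.mem_alloc(trt.volume(context.get_binding_shape(i)) * 4)
            for i in range(engine.num_bindings)]

context.execute_v2([int(b) for b in bindings])    # one profiled run

for name, ms in sorted(context.profiler.timings.items(), key=lambda kv: -kv[1]):
    print(f"{ms:8.3f} ms  {name}")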

ZimingLu commented 4 years ago

I guess we could talk about this problem in detail; my email address is gtayxlyc@163.com.

Hubert2102 commented 4 years ago

First I export the PyTorch model:

torch.onnx.export(model, dummy_input, "super_crnn.onnx", export_params=True, verbose=True,
                  input_names=['input'], output_names=['output'],
                  dynamic_axes={'input': {3: 'w'}, 'output': {3: 'w'}})

Then my TRT script is dynam_test_crnn.py.txt, and the output is as below:

[TensorRT] WARNING: Current optimization profile is: 0. Please ensure there are no enqueued operations pending in this context prior to switching profiles
(1, 1, 32, 320)
10240
[TensorRT] WARNING: Explicit batch network detected and batch size specified, use enqueue without batch size instead.
[TensorRT] WARNING: Explicit batch network detected and batch size specified, use enqueue without batch size instead.
[TensorRT] WARNING: Explicit batch network detected and batch size specified, use enqueue without batch size instead.
...
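
For reference, the repeated "use enqueue without batch size instead" warning usually means an explicit-batch engine is being driven through the implicit-batch execute path. A minimal sketch of the explicit-batch call, assuming the engine, device buffers (bindings) and CUDA stream are already created as in a typical script such as dynam_test_crnn.py.txt, would be:

# Sketch only: "engine", "bindings" (device pointers) and "stream" are assumed
# to exist already; the point is the _v2 call, which takes no batch-size argument
# because the batch dimension is already part of the binding shape.
context = engine.create_execution_context()
context.set_binding_shape(0, (1, 1, 32, 320))   # resolve the dynamic width axis
assert context.all_binding_shapes_specified
context.execute_async_v2(bindings=bindings, stream_handle=stream.handle)
stream.synchronize()                            # wait before reading the output buffers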

YIYANGCAI commented 4 years ago

Problem

Hello @Hubert2102, I am also porting a CRNN model from PyTorch to TensorRT via ONNX, and I obtained the .onnx model successfully. However, when I try to convert the ONNX model to TRT, the conversion fails. I also referred to your engine-building code, but got these errors:

Traceback (most recent call last):
  File "test.py", line 105, in <module>
    build_engine('./data/crnn_data/model1016.onnx')
  File "test.py", line 32, in build_engine
    profile.set_shape(network.get_input(0).name,(1,1,32,320))
TypeError: set_shape(): incompatible function arguments. The following argument types are supported:
    1. (self: tensorrt.tensorrt.IOptimizationProfile, input: str, min: tensorrt.tensorrt.Dims, opt: tensorrt.tensorrt.Dims, max: tensorrt.tensorrt.Dims) -> None

Invoked with: <tensorrt.tensorrt.IOptimizationProfile object at 0x7fa4c77d5490>, 'input.1', (1, 1, 32, 320)

Since my ONNX file produces the same output as the original .pth model, I think the ONNX file is correct. Could you help me find the problem? Many thanks.
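
For reference, the TypeError above says set_shape expects three shapes (min, opt, max), not a single one. A minimal sketch of an engine build with an optimization profile for the dynamic width axis, where the shape range, workspace size and file names are only assumptions, might look like this:

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_engine(onnx_path):
    # Dynamic shapes require an explicit-batch network.
    explicit_batch = 1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
    with trt.Builder(TRT_LOGGER) as builder, \
         builder.create_network(explicit_batch) as network, \
         trt.OnnxParser(network, TRT_LOGGER) as parser:
        with open(onnx_path, "rb") as f:
            if not parser.parse(f.read()):
                for i in range(parser.num_errors):
                    print(parser.get_error(i))
                return None

        config = builder.create_builder_config()
        config.max_workspace_size = 1 << 30   # 1 GiB, arbitrary choice

        # set_shape takes min/opt/max shapes; the width range below is a guess.
        profile = builder.create_optimization_profile()
        profile.set_shape(network.get_input(0).name,
                          min=(1, 1, 32, 32),
                          opt=(1, 1, 32, 320),
                          max=(1, 1, 32, 640))
        config.add_optimization_profile(profile)

        return builder.build_engine(network, config)

engine = build_engine("crnn.onnx")   # placeholder path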

By the way, when I build the CRNN engine from ONNX with TRT, the Slice operator hits the following error:

ERROR: (Unnamed Layer* 316) [Slice]: slice size must be positive, size = [0,0,0]
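
If it helps anyone hitting the same error, one way to find which Slice in the exported graph ends up with an empty range is to dump the Slice nodes and their constant starts/ends/axes inputs with the onnx package (the model path is a placeholder):

import onnx
from onnx import numpy_helper

model = onnx.load("crnn.onnx")   # placeholder path
inits = {t.name: numpy_helper.to_array(t) for t in model.graph.initializer}

for node in model.graph.node:
    if node.op_type != "Slice":
        continue
    # From opset 10 onwards, starts/ends/axes/steps are passed as extra inputs;
    # print whichever of them are constant initializers.
    params = [inits.get(name) for name in node.input[1:]]
    print(node.name or "<unnamed Slice>",
          [p.tolist() if p is not None else "?" for p in params])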

If you would like to discuss the problem, my email is caiyiyang980311@icloud.com

YIYANGCAI commented 4 years ago

Anyway, I have solved this problem. Many thanks for this issue!

wyp19960713 commented 4 years ago

@YIYANGCAI Hello, I have run into the same Slice error on my side. Could you please tell me how you solved it?

[TensorRT] ERROR: (Unnamed Layer* 238) [Slice]: slice size must be positive, size = [0,0,0]
[TensorRT] ERROR: (Unnamed Layer* 239) [Slice]: slice size must be positive, size = [0,0,0]

kevinch-nv commented 4 years ago

@Hubert2102 @ZimingLu are you still having trouble with the performance of your model? Have you tried benchmarking with TensorRT 7.1?
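
When comparing those numbers against PyTorch, a common pitfall is timing a single call without synchronizing the GPU. A small sketch of a warmed-up, averaged measurement, where infer is a hypothetical callable standing in for one full inference (for example execute_async_v2 followed by stream.synchronize()), is:

import time

def benchmark(infer, warmup=20, iters=200):
    # "infer" must block until the result is ready, otherwise the timing is meaningless.
    for _ in range(warmup):          # let clocks and caches settle
        infer()
    start = time.perf_counter()
    for _ in range(iters):
        infer()
    return (time.perf_counter() - start) / iters * 1e3   # average milliseconds per run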

quietsmile commented 4 years ago

@wyp19960713 Have you solved this problem?

quietsmile commented 4 years ago

@YIYANGCAI May I ask what speedup you got in your experiments? Thanks.

ttyio commented 3 years ago

Hello @Hubert2102 @ZimingLu, have you had a chance to try TRT 7.2, and how is the performance? Thanks!

ttyio commented 3 years ago

Closing since there has been no activity for more than 3 weeks. Please reopen if you still have questions, thanks!

jinec commented 1 year ago

Hello, may I ask how this problem turned out for you?

jinec commented 1 year ago

@Hubert2102 @ZimingLu May I ask how this problem turned out for you?