wang-xinyu / tensorrtx

Implementation of popular deep learning networks with TensorRT network definition API
MIT License
6.92k stars 1.77k forks

deepstream engine serializtion error #22

Closed sherlockking closed 4 years ago

sherlockking commented 4 years ago

I used the generated yolov4 engine in an NVIDIA DeepStream application, but got an error:

ERROR: ../nvdsinfer/nvdsinfer_func_utils.cpp:31 [TRT]: deserializationUtils.cpp (567) - Serialization Error in load: 0 (Serialized engine contains plugin, but no plugin factory was provided. To deserialize an engine without a factory, please use IPluginV2 instead.)
ERROR: ../nvdsinfer/nvdsinfer_func_utils.cpp:31 [TRT]: INVALID_STATE: std::exception
ERROR: ../nvdsinfer/nvdsinfer_func_utils.cpp:31 [TRT]: INVALID_CONFIG: Deserialize the cuda engine failed.
ERROR: ../nvdsinfer/nvdsinfer_model_builder.cpp:1452 Deserialize engine failed from file: /opt/nvidia/deepstream/deepstream-5.0/sources/YoloV4/yolov4.engine

Adding a plugin factory implementation might work, but I want to know whether moving to IPluginV2 is the better approach. By the way, I'm a beginner with TensorRT, so I'd appreciate your help.

wang-xinyu commented 4 years ago

@sherlockking This repo currently uses the TensorRT 4 API; the plugins were implemented with IPluginExt, not IPluginV2.

For this error, you may need to adapt the mish and yololayer plugins in yolov4 to IPluginV2.

sherlockking commented 4 years ago

@wang-xinyu I upgraded the mish, yolo, and leaky relu plugins to IPluginV2 and used IPluginCreator. Then I used the generated engine file in DeepStream. Unfortunately, there is still a problem:

ERROR: ../nvdsinfer/nvdsinfer_func_utils.cpp:31 [TRT]: INVALID_ARGUMENT: getPluginCreator could not find plugin mish version 1
ERROR: ../nvdsinfer/nvdsinfer_func_utils.cpp:31 [TRT]: safeDeserializationUtils.cpp (293) - Serialization Error in load: 0 (Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry)
ERROR: ../nvdsinfer/nvdsinfer_func_utils.cpp:31 [TRT]: INVALID_STATE: std::exception
ERROR: ../nvdsinfer/nvdsinfer_func_utils.cpp:31 [TRT]: INVALID_CONFIG: Deserialize the cuda engine failed.
ERROR: ../nvdsinfer/nvdsinfer_model_builder.cpp:1452 Deserialize engine failed from file: /opt/nvidia/deepstream/deepstream-5.0/sources/YoloV4/yolov4.engine

Now I have no idea how to fix it, so I've decided to give up on this method and instead try to integrate your code into the DeepStream application.

bobbilichandu commented 3 years ago

@sherlockking Were you able to solve this issue? I am using a TensorRT engine file from tkDNN, and when I tried to deserialize it with sample Python code I hit a similar issue:

[TensorRT] ERROR: deserializationUtils.cpp (528) - Serialization Error in load: 0 (Serialized engine contains plugin, but no plugin factory was provided. To deserialize an engine without a factory, please use IPluginV2 instead)

Any help would be great! Thanks!

LiberiFatali commented 1 year ago

Running the application as

LD_PRELOAD=./libcustomOp.so deepstream-app -c <app-config>

from https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_FAQ.html#how-to-handle-operations-not-supported-by-triton-inference-server
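The LD_PRELOAD trick works because the custom-op library's static initializers run when the .so is loaded, which registers the plugin creators before deepstream-app deserializes the engine. A sketch of what such a library typically contains (assumes TensorRT's NvInfer.h and a MishPluginCreator class implementing the creator interface already exist; this fragment only compiles inside a real plugin project):

```cpp
// mish_plugin_lib.cpp -- built into libcustomOp.so
#include "NvInfer.h"

// Static registration: runs at library load time (e.g. via LD_PRELOAD),
// so "mish" is in the plugin registry before any engine is deserialized.
REGISTER_TENSORRT_PLUGIN(MishPluginCreator);
```

An alternative to LD_PRELOAD is pointing the nvinfer config at the library, e.g. the custom-lib-path setting in the DeepStream nvinfer element, which loads the same .so for you.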