Open demuxin opened 1 week ago
Hi, have you recompiled your custom plugin with the appropriate headers and API definitions from TRT 10.x?
Also, FYI: https://docs.nvidia.com/deeplearning/tensorrt/developer-guide/index.html#plugin-serialization
I have recompiled the plugin libraries using TensorRT 10.3 and still have this problem.
I checked the link you gave and didn't find any solution.
I noticed that the custom plugin interface in TensorRT 10 changed to IPluginV3, while my custom plugin uses IPluginV2DynamicExt.
Could this difference be causing the problem?
I found this quote in the TensorRT Release Notes:

> Deprecated IPluginV2DynamicExt, implement IPluginV3 instead. Refer to the Migrating V2 Plugins to IPluginV3 for how existing IPluginV2DynamicExt plugins can be migrated to IPluginV3.
Does this mean that custom plugins of IPluginV2DynamicExt cannot be used in TensorRT 10.x?
If it can be used in TensorRT 10.x, what should I do?
> I found this quote in TensorRT Release Notes:
> Deprecated IPluginV2DynamicExt, implement IPluginV3 instead. Refer to the Migrating V2 Plugins to IPluginV3 for how existing IPluginV2DynamicExt plugins can be migrated to IPluginV3.
> Does this mean that custom plugins of IPluginV2DynamicExt cannot be used in TensorRT 10.x?

I believe that's correct. @samurdhikaru, please confirm, thanks.
Hi @moraxu @samurdhikaru, is there any conclusion on this? I've switched to IPluginV3 now and still have this problem.
Description
I wrote a custom TensorRT plugin and compiled it into a shared library.
I load the plugin library using the following statement:
It works fine with TensorRT 9.3, but when I run the program with TensorRT 10.3, this error appears:
Is there a difference in how plugin shared libraries are loaded between TensorRT 9.3 and TensorRT 10.3?
Environment
TensorRT Version: 10.3
NVIDIA GPU: RTX 3090
NVIDIA Driver Version: 535.183.01
CUDA Version: 12.2
Operating System: Ubuntu 22.04