Closed: jefflgaol closed this issue 4 years ago
The repo was originally built against TensorRT v4.0.1.6, which may not be compatible with newer Python versions. You may need to migrate the code to TensorRT v5 or later to use the IPluginV2 interface.
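If the repo's custom layers can be registered through the newer plugin registry, one possible workaround on the Python side is to load the plugin shared library and register all IPluginV2 creators *before* deserializing the engine. A minimal sketch, assuming TensorRT v5+ Python bindings; the library path `libyolov3_plugins.so` is hypothetical and depends on how the C++ code was built:

```python
import ctypes

# Hypothetical path to the plugin shared library produced by the repo's C++ build.
PLUGIN_LIB = "./build/libyolov3_plugins.so"  # assumption: adjust to your build output

def load_plugins_and_deserialize(engine_path, plugin_lib=PLUGIN_LIB):
    """Load custom plugins, register IPluginV2 creators, then deserialize the engine."""
    import tensorrt as trt  # deferred import; requires TensorRT >= 5 Python bindings

    ctypes.CDLL(plugin_lib)                  # make the custom plugin symbols visible
    logger = trt.Logger()
    trt.init_libnvinfer_plugins(logger, "")  # register all IPluginV2 creators
    with open(engine_path, "rb") as f, trt.Runtime(logger) as runtime:
        return runtime.deserialize_cuda_engine(f.read())
```

This only helps if the engine's plugins were serialized through the IPluginV2 path; an engine built with the old IPlugin/PluginFactory mechanism still needs the matching factory at deserialization time.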
I tried to use the engine produced by your C++ code and called it via Python.
Below is the code that I used:
```python
import tensorrt as trt

engine_file_path = '/home/cwlab/ALPR/TensorRT-Yolov3-master/yolov3_int8.engine'
TRT_LOGGER = trt.Logger()

def get_engine(path):
    with open(path, "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
        return runtime.deserialize_cuda_engine(serialized_engine=f.read())

def main():
    print("Reading engine from file {}".format(engine_file_path))
    engine = get_engine(engine_file_path)

if __name__ == '__main__':
    main()
```
But I get the following error:

```
[TensorRT] ERROR: Serialized engine contains plugin, but no plugin factory was provided. To deserialize an engine without a factory, please use IPluginV2 instead.
```
Do you mind sharing what I should do? Or can I simply not import the engine without the shared library? I can't find one. Thank you.
Hello, have you solved the problem?
Hi @jefflgaol, have you solved the problem? We are facing it too.
I haven't solved it yet. Instead, I trained my own model using Darknet and created my own engine. The tutorial in the NVIDIA documentation is very clear.
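The NVIDIA route likely refers to converting the Darknet model to ONNX and then building the engine in Python. A rough sketch of the ONNX-to-engine step, assuming the TensorRT 5/6 Python API (the function name and workspace size are illustrative, not from the repo):

```python
def build_engine_from_onnx(onnx_path, max_workspace=1 << 28):
    """Sketch: parse an ONNX model and build a TensorRT engine (TensorRT 5/6-era API)."""
    import tensorrt as trt  # deferred import; requires TensorRT Python bindings

    logger = trt.Logger()
    builder = trt.Builder(logger)
    network = builder.create_network()
    parser = trt.OnnxParser(network, logger)
    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            raise RuntimeError("failed to parse ONNX model")
    builder.max_workspace_size = max_workspace
    return builder.build_cuda_engine(network)
```

Because the engine is built from a standard ONNX graph rather than the repo's custom C++ plugins, it deserializes in Python without a plugin factory.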
@jefflgaol Have you solved this? Can you share the link to the tutorial, please? Thanks!