nwesem / mtcnn_facenet_cpp_tensorRT

Face Recognition on NVIDIA Jetson (Nano) using TensorRT
GNU General Public License v3.0
200 stars 72 forks

Error loading model with python #6

Open elisabetta496 opened 4 years ago

elisabetta496 commented 4 years ago

Hi, I tested your C++ implementation and I would like to implement it in Python. I'm trying to load the engine file but the problem is that the plugin is not found when loading the engine with:

    with open(engineFile, "rb") as f, trt.Runtime(G_LOGGER) as runtime:
        engine = runtime.deserialize_cuda_engine(f.read())

I get the following error:

    [TensorRT] ERROR: INVALID_ARGUMENT: getPluginCreator could not find plugin L2Norm_Helper_TRT version 1
    [TensorRT] ERROR: safeDeserializationUtils.cpp (259) - Serialization Error in load: 0 (Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry)
    [TensorRT] ERROR: INVALID_STATE: std::exception
    [TensorRT] ERROR: INVALID_CONFIG: Deserialize the cuda engine failed.

Do you know how I could solve this issue? Thanks

nwesem commented 4 years ago

Hi, I'm happy to hear that. As stated in the README, this uses a custom plugin from https://github.com/r7vme/tensorrt_l2norm_helper. Please check how to load custom TensorRT plugins in Python and let me know if you need more help.
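For reference, loading an engine that depends on a custom plugin usually amounts to loading the plugin's shared library (so its static initializers register the `IPluginCreator`) before deserializing. A minimal sketch, assuming the l2norm helper has been compiled to a shared library whose path you supply (the file name `libtensorrt_l2norm_helper.so` below is an assumption, not something this repo ships by default):

```python
import ctypes


def load_engine_with_plugin(engine_path, plugin_lib_path):
    """Deserialize a TensorRT engine that depends on a custom plugin.

    tensorrt is imported inside the function so this sketch can be
    inspected on machines without a TensorRT install.
    """
    import tensorrt as trt  # requires the TensorRT Python bindings

    # Loading the shared library runs its static initializers, which is
    # where REGISTER_TENSORRT_PLUGIN registers the plugin creator.
    ctypes.CDLL(plugin_lib_path)

    logger = trt.Logger(trt.Logger.WARNING)
    # Also register TensorRT's built-in plugins, in case the engine uses any.
    trt.init_libnvinfer_plugins(logger, "")

    with open(engine_path, "rb") as f, trt.Runtime(logger) as runtime:
        return runtime.deserialize_cuda_engine(f.read())


# Example call (paths are assumptions):
# engine = load_engine_with_plugin("facenet.engine",
#                                  "./libtensorrt_l2norm_helper.so")
```

Without the `ctypes.CDLL` step, `getPluginCreator` cannot find `L2Norm_Helper_TRT` and deserialization fails with exactly the error quoted above.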

deaffella commented 4 years ago

Hi, I tested your C++ implementation and I would like to implement it in Python. I'm trying to load the engine file but the problem is that the plugin is not found when loading the engine with:

    with open(engineFile, "rb") as f, trt.Runtime(G_LOGGER) as runtime:
        engine = runtime.deserialize_cuda_engine(f.read())

I get the following error:

    [TensorRT] ERROR: INVALID_ARGUMENT: getPluginCreator could not find plugin L2Norm_Helper_TRT version 1
    [TensorRT] ERROR: safeDeserializationUtils.cpp (259) - Serialization Error in load: 0 (Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry)
    [TensorRT] ERROR: INVALID_STATE: std::exception
    [TensorRT] ERROR: INVALID_CONFIG: Deserialize the cuda engine failed.

Do you know how I could solve this issue? Thanks

Hi! I want to use UFF models in my Python script, but I don't know how to load them correctly. Have you solved the problem of loading models in Python?

nwesem commented 4 years ago

Does this part of the TensorRT documentation help you?

deaffella commented 4 years ago

Does this part of the TensorRT documentation help you?

No, it does not. I don't see any instructions for loading .uff and .engine models in this documentation.

elisabetta496 commented 4 years ago

No, I'm sorry! I've only loaded the engine file so far. Have you solved the problem? If not, I will let you know if I manage it.

deaffella commented 4 years ago

No, I'm sorry! I've only loaded the engine file so far. Have you solved the problem? If not, I will let you know if I manage it.

Unfortunately, I couldn't find any example of loading these models in Python. Please help me.

do-van-long commented 4 years ago

Hi @elisabetta496! I have the same problem as @deaffella. Could you please share with us how to convert and save facenet.uff to facenet.engine using Python? Many thanks.

deaffella commented 3 years ago

does this part of TensorRT documentation help you?

Hi! I'm still trying to open optimized models in Python. Could you help me with advice on how to create a .so file for the plugin?

nwesem commented 3 years ago

I'm not sure if this is the way it works. Are you @deaffella? If so, I could help you compile the project as a dynamic or static library.

AnasMK commented 3 years ago

First note this quote from the official TensorRT Release Notes:

Deprecation of Caffe Parser and UFF Parser - We are deprecating Caffe Parser and UFF Parser in TensorRT 7. They will be tested and functional in the next major release of TensorRT 8, but we plan to remove the support in the subsequent major release. Plan to migrate your workflow to use tf2onnx, keras2onnx or TensorFlow-TensorRT (TF-TRT) for deployment.

I have successfully converted Facenet to a TRT engine using ONNX and used it with Python.

I downloaded the facenet_keras.h5 provided in this tutorial, then converted it to a TRT engine via ONNX using this Python tutorial provided by Nvidia.
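To make the ONNX route concrete: once the Keras model has been exported to a .onnx file (e.g. with tf2onnx or keras2onnx, per the deprecation notice above), building the engine looks roughly like the sketch below. This is a sketch of the general TensorRT 7 ONNX workflow, not the exact code from the linked tutorial; the `EXPLICIT_BATCH` flag is required for ONNX models in TensorRT 7+:

```python
def onnx_to_engine(onnx_path, engine_path):
    """Build and serialize a TensorRT engine from an ONNX model."""
    import tensorrt as trt  # requires the TensorRT Python bindings

    logger = trt.Logger(trt.Logger.WARNING)
    builder = trt.Builder(logger)

    # ONNX models require an explicit-batch network definition.
    flags = 1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
    network = builder.create_network(flags)
    parser = trt.OnnxParser(network, logger)

    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            # Surface parser diagnostics before giving up.
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            raise RuntimeError("ONNX parse failed")

    builder.max_workspace_size = 1 << 28  # 256 MiB scratch space
    engine = builder.build_cuda_engine(network)
    with open(engine_path, "wb") as f:
        f.write(engine.serialize())
```

Because the ONNX graph can express L2 normalization natively, this path avoids the custom plugin entirely, which is why the engine then loads in Python without any `ctypes` tricks.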

AnasMK commented 3 years ago

Check this repo riotu-lab/tf2trt_with_onnx to convert the Facenet model to a TensorRT engine and use it with Python.