enazoe / yolo-tensorrt

TensorRT 8. Supports Yolov5n, s, m, l, x, and darknet -> tensorrt. Yolov4 and Yolov3 use raw darknet *.weights and *.cfg files. If the wrapper is useful to you, please star it.

jetsonnano deepstream #168

Open ONNONS opened 2 years ago

ONNONS commented 2 years ago

I have successfully created the engine file. I need help with the following error when trying to use DeepStream in the Jetson Nano environment:

deepstream-app -c deepstream_app_config.txt

Using winsys: x11
ERROR: [TRT]: 3: getPluginCreator could not find plugin: DETECT_TRT version: 1.0
ERROR: [TRT]: 1: [pluginV2Runner.cpp::load::292] Error Code 1: Serialization (Serialization assertion creator failed.Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry)
ERROR: [TRT]: 4: [runtime.cpp::deserializeCudaEngine::76] Error Code 4: Internal Error (Engine deserialization failed.)
ERROR: Deserialize engine failed from file: /home/iisl/Desktop/deepstream-6.0/sources/wf6_trt_engine/yolov5.engine
0:00:02.010330404 16896 0xc73e240 WARN nvinfer gstnvinfer.cpp:635:gst_nvinfer_logger: NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() [UID = 1]: deserialize engine from file :/home/iisl/Desktop/deepstream-6.0/sources/wf6_trt_engine/yolov5.engine failed
0:00:02.010527907 16896 0xc73e240 WARN nvinfer gstnvinfer.cpp:635:gst_nvinfer_logger: NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() [UID = 1]: deserialize backend context from engine from file :/home/iisl/Desktop/deepstream-6.0/sources/wf6_trt_engine/yolov5.engine failed, try rebuild
0:00:02.010574939 16896 0xc73e240 INFO nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger: NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() [UID = 1]: Trying to create engine from model files
ERROR: failed to build network since there is no model file matched.
ERROR: failed to build network.
0:00:02.012996796 16896 0xc73e240 ERROR nvinfer gstnvinfer.cpp:632:gst_nvinfer_logger: NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::buildModel() [UID = 1]: build engine file failed
0:00:02.013085183 16896 0xc73e240 ERROR nvinfer gstnvinfer.cpp:632:gst_nvinfer_logger: NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::generateBackendContext() [UID = 1]: build backend context failed
0:00:02.013122475 16896 0xc73e240 ERROR nvinfer gstnvinfer.cpp:632:gst_nvinfer_logger: NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::initialize() [UID = 1]: generate backend failed, check config file settings
0:00:02.013796859 16896 0xc73e240 WARN nvinfer gstnvinfer.cpp:841:gst_nvinfer_start: error: Failed to create NvDsInferContext instance
0:00:02.013855350 16896 0xc73e240 WARN nvinfer gstnvinfer.cpp:841:gst_nvinfer_start: error: Config file path: /home/iisl/Desktop/deepstream-6.0/sources/wf6_trt_engine/config_infer_primary.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED
** ERROR: : Failed to set pipeline to PAUSED
Quitting
ERROR from primary_gie: Failed to create NvDsInferContext instance
Debug info: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvinfer/gstnvinfer.cpp(841): gst_nvinfer_start (): /GstPipeline:pipeline/GstBin:primary_gie_bin/GstNvInfer:primary_gie: Config file path: /home/iisl/Desktop/deepstream-6.0/sources/wf6_trt_engine/config_infer_primary.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED
App run failed
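
The first errors say TensorRT cannot find an IPluginCreator named DETECT_TRT while deserializing the engine: the engine was serialized with the wrapper's custom plugin, but deepstream-app's TensorRT runtime has no matching creator in its plugin registry, so deserialization aborts. The sketch below shows, under stated assumptions, what that deserialization path needs; the library name libdetect_plugin.so and the engine path are illustrative only, not names taken from this repository.

// deserialize_check.cpp -- minimal sketch of the TensorRT 8 deserialization path.
// Assumption: "libdetect_plugin.so" stands in for whatever shared library
// registers the DETECT_TRT IPluginCreator; it is not a file from this repo.
// Build (roughly): g++ deserialize_check.cpp -o deserialize_check -lnvinfer -ldl
#include <NvInfer.h>
#include <dlfcn.h>
#include <fstream>
#include <iostream>
#include <iterator>
#include <vector>

class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) noexcept override {
        if (severity <= Severity::kWARNING) std::cerr << msg << std::endl;
    }
};

int main() {
    Logger logger;

    // 1. Load the shared library whose static initializers register the custom
    //    plugin creator (typically via REGISTER_TENSORRT_PLUGIN). Without this
    //    step the registry has no "DETECT_TRT" entry and deserializeCudaEngine
    //    fails exactly as in the log above.
    void* handle = dlopen("libdetect_plugin.so", RTLD_NOW);  // assumed name
    if (!handle)
        std::cerr << "custom plugin library not loaded: " << dlerror() << std::endl;

    // 2. Check that the creator the engine expects is now visible.
    auto* creator = getPluginRegistry()->getPluginCreator("DETECT_TRT", "1.0");
    std::cout << "DETECT_TRT creator " << (creator ? "found" : "NOT found") << std::endl;

    // 3. Deserialize the engine with that registry in place.
    std::ifstream f("/home/iisl/Desktop/deepstream-6.0/sources/wf6_trt_engine/yolov5.engine",
                    std::ios::binary);
    std::vector<char> blob((std::istreambuf_iterator<char>(f)),
                           std::istreambuf_iterator<char>());
    nvinfer1::IRuntime* runtime = nvinfer1::createInferRuntime(logger);
    nvinfer1::ICudaEngine* engine =
        runtime->deserializeCudaEngine(blob.data(), blob.size());
    std::cout << "engine " << (engine ? "deserialized" : "failed to deserialize") << std::endl;
    return engine ? 0 : 1;
}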
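
The later lines ("failed to build network since there is no model file matched") are a knock-on effect: once deserialization fails, nvinfer tries to rebuild the engine and finds no ONNX/Caffe/UFF model configured, so it gives up. For context, a config_infer_primary.txt for a pre-built YOLO engine usually looks roughly like the sketch below. The key names are standard nvinfer properties, but the class count, scale factor, and the parser library/function names are assumptions for illustration, not values taken from this issue; custom-lib-path is typically also the library that nvinfer dlopens, which is the usual place a custom IPluginCreator such as DETECT_TRT would get registered.

[property]
gpu-id=0
# 1/255, assumed YOLO-style preprocessing
net-scale-factor=0.0039215697906911373
model-engine-file=/home/iisl/Desktop/deepstream-6.0/sources/wf6_trt_engine/yolov5.engine
labelfile-path=labels.txt
batch-size=1
# 0=FP32, 1=INT8, 2=FP16
network-mode=2
num-detected-classes=80
gie-unique-id=1
# 0=detector
network-type=0
cluster-mode=2
maintain-aspect-ratio=1
# assumed names: replace with the parser function and library you actually built
parse-bbox-func-name=NvDsInferParseCustomYoloV5
custom-lib-path=/path/to/libnvdsinfer_custom_impl_yolo.so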