NVIDIA-AI-IOT / yolo_deepstream

YOLO model QAT and deployment with DeepStream & TensorRT
Apache License 2.0
550 stars 139 forks

Python DeepStream API with YOLOv4: multichannel RTSP video stream inference cannot run #24

Open zhidk opened 2 years ago

zhidk commented 2 years ago

My setup: OS Ubuntu 20.04, CPU Intel(R) Core(TM) i9-12900KF, GPU NVIDIA 3090 with 24 GB of video memory. DeepStream 6.1 installed successfully, and the Python bindings for DeepStream work.

I cloned https://github.com/NVIDIA-AI-IOT/yolov4_deepstream.git and it compiled successfully, ready to run. Using the Python interface with deepstream-test3: one RTSP stream runs fine, but two RTSP streams fail. In the deepstream-test3 folder, I modified deepstream-test3.py to set `pgie.set_property('config-file-path', "config_infer_primary_yoloV4.txt")`.

`python3 deepstream-test3.py -i rtsp://admin:CDWAPM@192.168.1.9/h264/ch1/main/av_stream -s` — no problem.

but `python3 deepstream-test3.py -i rtsp://admin:CDWAPM@192.168.1.9/h264/ch1/main/av_stream rtsp://admin:FPTKSH@192.168.1.98/h264/ch1/main/av_stream -s` fails.
problem:

```
0:00:09.288897521 97120 0x29c0900 WARN  nvinfer gstnvinfer.cpp:643:gst_nvinfer_logger: NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::checkBackendParams() [UID = 1]: Backend has maxBatchSize 1 whereas 2 has been requested
0:00:09.289061931 97120 0x29c0900 WARN  nvinfer gstnvinfer.cpp:643:gst_nvinfer_logger: NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() [UID = 1]: deserialized backend context :/home/ai-box/deepstream/nvidia/deepstream/deepstream-6.1/sources/deepstream_yolov4/yolov4.engine failed to match config params, trying rebuild
0:00:09.297623267 97120 0x29c0900 INFO  nvinfer gstnvinfer.cpp:646:gst_nvinfer_logger: NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() [UID = 1]: Trying to create engine from model files
ERROR: ../nvdsinfer/nvdsinfer_model_builder.cpp:860 failed to build network since there is no model file matched.
ERROR: ../nvdsinfer/nvdsinfer_model_builder.cpp:799 failed to build network.
0:00:09.801857495 97120 0x29c0900 ERROR nvinfer gstnvinfer.cpp:640:gst_nvinfer_logger: NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::buildModel() [UID = 1]: build engine file failed
0:00:09.851086491 97120 0x29c0900 ERROR nvinfer gstnvinfer.cpp:640:gst_nvinfer_logger: NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::generateBackendContext() [UID = 1]: build backend context failed
0:00:09.851121902 97120 0x29c0900 ERROR nvinfer gstnvinfer.cpp:640:gst_nvinfer_logger: NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::initialize() [UID = 1]: generate backend failed, check config file settings
0:00:09.851353192 97120 0x29c0900 WARN  nvinfer gstnvinfer.cpp:846:gst_nvinfer_start: error: Failed to create NvDsInferContext instance
0:00:09.851358835 97120 0x29c0900 WARN  nvinfer gstnvinfer.cpp:846:gst_nvinfer_start: error: Config file path: config_infer_primary_yoloV4.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED
```

**PERF: {'stream0': 0.0, 'stream1': 0.0}

Error: gst-resource-error-quark: Failed to create NvDsInferContext instance (1): gstnvinfer.cpp(846): gst_nvinfer_start (): /GstPipeline:pipeline0/GstNvInfer:primary-inference: Config file path: config_infer_primary_yoloV4.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED
Exiting app
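Reading the log top to bottom: the cached `yolov4.engine` was serialized with maxBatchSize 1, but two sources make deepstream-test3 request batch size 2, so nvinfer rejects the engine and tries to rebuild it from a model file, and that rebuild fails because no model file in the config matches anything on disk. A minimal sketch of this diagnosis (a hypothetical helper, not part of the repo) that checks an nvinfer config for both conditions:

```python
import configparser

def check_nvinfer_config(config_text, num_sources):
    """Flag the two conditions behind this failure in a Gst-nvinfer
    config (hypothetical helper): a batch-size smaller than the number
    of sources, and a missing model file to rebuild the engine from."""
    cfg = configparser.ConfigParser()
    cfg.read_string(config_text)
    props = cfg["property"]
    problems = []
    batch = int(props.get("batch-size", "1"))
    if batch < num_sources:
        problems.append(
            f"batch-size={batch} but {num_sources} sources requested; "
            "the cached engine will be rejected and rebuilt"
        )
    # nvinfer can only rebuild if some model file key is present
    if not any(k in props for k in ("onnx-file", "model-file",
                                    "custom-network-config")):
        problems.append("no model file set; rebuild will fail "
                        "('no model file matched')")
    return problems

# A config like the one in this issue trips both checks:
broken = "[property]\nbatch-size=1\nmodel-engine-file=yolov4.engine\n"
print(check_nvinfer_config(broken, num_sources=2))
```

With `batch-size=2` and a valid `onnx-file` (or equivalent) set, the helper reports no problems, which mirrors the fix for this issue.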

mchi-zg commented 2 years ago

The error is: `ERROR: ../nvdsinfer/nvdsinfer_model_builder.cpp:860 failed to build network since there is no model file matched.`

So, check that the model file named in the config file matches the model file you actually put in the folder.
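For reference, the relevant keys in `config_infer_primary_yoloV4.txt` look roughly like this (paths are illustrative; adjust them to where your files actually live). With two sources, `batch-size` must be at least 2, and a model file such as `onnx-file` must point to an existing file so nvinfer can rebuild the engine when the cached one is rejected:

```ini
[property]
# must exist on disk, or nvinfer fails with "no model file matched"
onnx-file=yolov4.onnx
# cached engine; rejected and rebuilt if its maxBatchSize < batch-size
model-engine-file=yolov4.engine
# must be >= the number of input streams
batch-size=2
```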

I just updated the DeepStream yolov4 folder; you could check the latest code.