This is a sample application for counting people entering/leaving a building using the NVIDIA DeepStream SDK, Transfer Learning Toolkit (TLT), and pre-trained models. It can be used to build real-time occupancy analytics applications for smart buildings, hospitals, retail, etc. The application is based on the deepstream-test5 sample application.
The download link for the model is not working. I downloaded another model from the NGC catalog page below, but it doesn't seem to work:
https://catalog.ngc.nvidia.com/models?filters=&orderBy=dateModifiedDESC&query=people
That gives me resnet34_peoplenet_int8.etlt and resnet34_peoplenet_int8.txt. I then move resnet34_peoplenet_int8.txt to /deepstream-occupancy-analytics/config/peoplenet and rename it to labels.txt.
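For reference, this is roughly what I did after downloading the two files through the NGC web page (the ~/Downloads location is just how I am illustrating it here; the exact source directory may have differed):

# move the downloaded model files into the sample's config folder
cd ~/ds6_test/deepstream-occupancy-analytics
mv ~/Downloads/resnet34_peoplenet_int8.etlt config/peoplenet/
mv ~/Downloads/resnet34_peoplenet_int8.txt config/peoplenet/labels.txt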
When I run the application, I get the following error:
nvidia@nano:~/ds6_test/deepstream-occupancy-analytics$ ./deepstream-test5-analytics -c config/test5_config_file_src_infer_tlt.txt
** WARN: : Deprecated config 'smart-rec-video-cache' used in group [source0]. Use 'smart-rec-cache' instead
(deepstream-test5-analytics:13446): GLib-CRITICAL **: 18:33:43.568: g_strchug: assertion 'string != NULL' failed
(deepstream-test5-analytics:13446): GLib-CRITICAL **: 18:33:43.568: g_strchomp: assertion 'string != NULL' failed
Warning: 'input-dims' parameter has been deprecated. Use 'infer-dims' instead.
Using winsys: x11
gstnvtracker: Loading low-level lib at /opt/nvidia/deepstream/deepstream-6.0/lib/libnvds_nvmultiobjecttracker.so
~~ CLOG[/dvs/git/dirty/git-master_linux/deepstream/sdk/src/utils/nvmultiobjecttracker/include/modules/NvMultiObjectTracker/NvTrackerParams.hpp, getConfigRoot() @line 54]: [NvTrackerParams::getConfigRoot()] !!![WARNING] Invalid low-level config file caused an exception, but will go ahead with the default config values
gstnvtracker: Batch processing is ON
gstnvtracker: Past frame output is OFF
~~ CLOG[/dvs/git/dirty/git-master_linux/deepstream/sdk/src/utils/nvmultiobjecttracker/include/modules/NvMultiObjectTracker/NvTrackerParams.hpp, getConfigRoot() @line 54]: [NvTrackerParams::getConfigRoot()] !!![WARNING] Invalid low-level config file caused an exception, but will go ahead with the default config values
[NvMultiObjectTracker] Initialized
ERROR: Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-6.0/sources/apps/sample_apps/deepstream-occupancy-analytics/config/peoplenet/resnet34_peoplenet_pruned.etlt_b1_gpu0_int8.engine open error
0:00:01.582629551 13446 0x559ee32070 WARN nvinfer gstnvinfer.cpp:635:gst_nvinfer_logger: NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() [UID = 1]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-6.0/sources/apps/sample_apps/deepstream-occupancy-analytics/config/peoplenet/resnet34_peoplenet_pruned.etlt_b1_gpu0_int8.engine failed
0:00:01.582765491 13446 0x559ee32070 WARN nvinfer gstnvinfer.cpp:635:gst_nvinfer_logger: NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() [UID = 1]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-6.0/sources/apps/sample_apps/deepstream-occupancy-analytics/config/peoplenet/resnet34_peoplenet_pruned.etlt_b1_gpu0_int8.engine failed, try rebuild
0:00:01.582793356 13446 0x559ee32070 INFO nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger: NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() [UID = 1]: Trying to create engine from model files
WARNING: INT8 not supported by platform. Trying FP16 mode.
NvDsInferCudaEngineGetFromTltModel: Failed to open TLT encoded model file /opt/nvidia/deepstream/deepstream-6.0/sources/apps/sample_apps/deepstream-occupancy-analytics/config/peoplenet/resnet34_peoplenet_pruned.etlt
ERROR: Failed to create network using custom network creation function
ERROR: Failed to get cuda engine from custom library API
0:00:01.583686186 13446 0x559ee32070 ERROR nvinfer gstnvinfer.cpp:632:gst_nvinfer_logger: NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::buildModel() [UID = 1]: build engine file failed
terminate called after throwing an instance of 'nvinfer1::InternalError'
what(): Assertion mRefCount > 0 failed.
Aborted (core dumped)
nvidia@nano:~/ds6_test/deepstream-occupancy-analytics$
I think the problem is caused by the model. Where can I still download the correct model?
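For context, the pgie config under config/peoplenet still points at the pruned model filename that appears in the log, while the file I actually have is resnet34_peoplenet_int8.etlt. This is a sketch from memory of the relevant [property] keys, not a verbatim copy of my file; the tlt-model-key and input-dims values are what I believe the pretrained PeopleNet uses:

[property]
# path the app is trying to load according to the log above
tlt-encoded-model=resnet34_peoplenet_pruned.etlt
model-engine-file=resnet34_peoplenet_pruned.etlt_b1_gpu0_int8.engine
tlt-model-key=tlt_encode
labelfile-path=labels.txt
# 'input-dims' is what triggers the deprecation warning; 'infer-dims' is the suggested replacement
input-dims=3;544;960;0

If the original resnet34_peoplenet_pruned.etlt is no longer published, I assume I would have to change tlt-encoded-model and model-engine-file to match the resnet34_peoplenet_int8.etlt file I downloaded, but I am not sure whether that model is compatible with this sample.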