NVIDIA-AI-IOT / deepstream_tao_apps

Sample apps to demonstrate how to deploy models trained with TAO on DeepStream
MIT License

Unsupported operation _MultilevelProposeROI_TRT on Jetson Xavier #29

Open tlalexander opened 3 years ago

tlalexander commented 3 years ago

Hello. I have modified the jetson deepstream_app_source1_mrcnn.txt config file to use the mask_rcnn_resnet50.etlt file from the models linked in this repo's README. But when I try to run deepstream against that model, I get the error:

ERROR: [TRT]: UffParser: Validator error: multilevel_propose_rois: Unsupported operation _MultilevelProposeROI_TRT

The full output is:

deepstream-app -c deepstream_app_source1_mrcnn.txt 

Using winsys: x11 
ERROR: Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-5.0/samples/configs/tlt_pretrained_models/../../models/tlt_pretrained_models/mrcnn/mask_rcnn_resnet50.etlt_b1_gpu0_int8.engine open error
0:00:01.157964011 21695   0x556aa94a30 WARN                 nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1690> [UID = 1]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-5.0/samples/configs/tlt_pretrained_models/../../models/tlt_pretrained_models/mrcnn/mask_rcnn_resnet50.etlt_b1_gpu0_int8.engine failed
0:00:01.158108114 21695   0x556aa94a30 WARN                 nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1797> [UID = 1]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-5.0/samples/configs/tlt_pretrained_models/../../models/tlt_pretrained_models/mrcnn/mask_rcnn_resnet50.etlt_b1_gpu0_int8.engine failed, try rebuild
0:00:01.158157396 21695   0x556aa94a30 INFO                 nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1715> [UID = 1]: Trying to create engine from model files
ERROR: [TRT]: UffParser: Validator error: multilevel_propose_rois: Unsupported operation _MultilevelProposeROI_TRT
parseModel: Failed to parse UFF model
ERROR: failed to build network since parsing model errors.
ERROR: Failed to create network using custom network creation function
ERROR: Failed to get cuda engine from custom library API
0:00:01.984863191 21695   0x556aa94a30 ERROR                nvinfer gstnvinfer.cpp:613:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1735> [UID = 1]: build engine file failed

I am using JetPack 4.4 with DeepStream 5.0 and TensorRT 7.1.3.0. Is this supposed to work? Thanks!
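For reference, the relevant part of the modification described above is the primary-GIE inference config pointing at the encoded TAO model. A minimal sketch of those keys (paths and the model key here are placeholders, not the exact values from my setup):

```ini
# Fragment of a config_infer_*.txt referenced by deepstream_app_source1_mrcnn.txt.
# tlt-encoded-model points at the .etlt from the README's model links;
# tlt-model-key must match the key the model was exported with.
[property]
tlt-encoded-model=../../models/tlt_pretrained_models/mrcnn/mask_rcnn_resnet50.etlt
tlt-model-key=<your-model-key>
# nvinfer falls back to building this engine from the .etlt when it is missing:
model-engine-file=../../models/tlt_pretrained_models/mrcnn/mask_rcnn_resnet50.etlt_b1_gpu0_int8.engine
```

The "Deserialize engine failed ... open error" warning above is expected on first run: no serialized engine exists yet, so nvinfer tries to build one from the .etlt, which is where the UFF parser hits the unsupported op.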

yaoyh commented 3 years ago

I have the same problem.

tlalexander commented 3 years ago

Hello! See the solution in this post: https://forums.developer.nvidia.com/t/maskrcnn-on-xavier-uffparser-validator-error-unsupported-operation-generatedetection-trt/159815/10?u=tlalexander
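In short, the linked fix is to rebuild `libnvinfer_plugin` from the TensorRT OSS repository, which contains the `MultilevelProposeROI_TRT` plugin missing from the stock JetPack library, and swap it in. A sketch of the steps on Jetson Xavier (branch, `GPU_ARCHS` value, and library version/paths are assumptions for JetPack 4.4 / TensorRT 7.1 — adjust them to your install):

```shell
# Clone the TensorRT OSS release matching the installed TensorRT (7.1 here).
git clone -b release/7.1 https://github.com/NVIDIA/TensorRT.git
cd TensorRT
git submodule update --init --recursive

# Configure; GPU_ARCHS=72 is the SM version for Xavier's Volta GPU.
mkdir -p build && cd build
cmake .. -DGPU_ARCHS=72 -DTRT_LIB_DIR=/usr/lib/aarch64-linux-gnu

# Build only the plugin library.
make nvinfer_plugin -j"$(nproc)"

# Back up the stock plugin library, then replace it with the OSS build.
sudo cp /usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so.7.1.3 \
        /usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so.7.1.3.bak
sudo cp libnvinfer_plugin.so.7.1.3 /usr/lib/aarch64-linux-gnu/
sudo ldconfig
```

After replacing the library, delete any stale `.engine` file and rerun `deepstream-app` so the engine is rebuilt with the new plugins available.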