NVIDIA-AI-IOT / deepstream_triton_model_deploy

How to deploy open source models using DeepStream and Triton Inference Server
Apache License 2.0

Unexpected platform and failure to load model (Solved: ONNX with Triton not supported on Jetson) #10

Closed: ncw16 closed this issue 3 years ago

ncw16 commented 3 years ago

Hello,

I am encountering an error when running "deepstream-app -c source1_primary_detector.txt"

Regarding the model, I followed the README and ran "run.sh", and I have kept the model in the directory indicated in this issue: https://github.com/NVIDIA-AI-IOT/deepstream_triton_model_deploy/issues/5

I haven't made any changes to the centerface.txt file.

Thanks in advance for any help.

Here is the error:

E0208 21:41:07.509156 20068 model_repository_manager.cc:1519] unexpected platform type onnxruntime_onnx for centerface
ERROR: TRTIS: failed to load model centerface, trtis_err_str:INTERNAL, err_msg:failed to load 'centerface', no version is available
ERROR: failed to load model: centerface, nvinfer error:NVDSINFER_TRTIS_ERROR
ERROR: failed to initialize backend while ensuring model:centerface ready, nvinfer error:NVDSINFER_TRTIS_ERROR
0:00:04.427607013 20068 0x35fdeac0 ERROR nvinferserver gstnvinferserver.cpp:362:gst_nvinfer_server_logger: nvinferserver[UID 1]: Error in createNNBackend() [UID = 1]: failed to initialize trtis backend for model:centerface, nvinfer error:NVDSINFER_TRTIS_ERROR
I0208 21:41:07.509727 20068 server.cc:179] Waiting for in-flight inferences to complete.
I0208 21:41:07.509778 20068 server.cc:194] Timeout 30: Found 0 live models and 0 in-flight requests
0:00:04.427822468 20068 0x35fdeac0 ERROR nvinferserver gstnvinferserver.cpp:362:gst_nvinfer_server_logger: nvinferserver[UID 1]: Error in initialize() [UID = 1]: create nn-backend failed, check config file settings, nvinfer error:NVDSINFER_TRTIS_ERROR
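The first log line is the root cause: Triton's model_repository_manager rejects the platform type onnxruntime_onnx declared for centerface, meaning this Triton build has no ONNX Runtime backend to resolve it to. For reference, a minimal sketch of the config.pbtxt fragment that triggers the check; only the model name and platform string are confirmed by the log above, and a real config would also declare input/output tensors, which are omitted here:

    # centerface/config.pbtxt (sketch; only name and platform shown)
    name: "centerface"
    platform: "onnxruntime_onnx"  # rejected: this Triton build cannot load an ONNX Runtime backend

Triton maps the platform string to a backend at model-load time, so an unsupported value fails immediately with the "unexpected platform type" error, and the model never becomes available.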

mjhuria commented 3 years ago

Hi,

Could you please print the directory tree?

ncw16 commented 3 years ago

Hi @mjhuria

Here is the directory tree:

.
├── centerface
│   ├── 1
│   │   ├── change_dim.py
│   │   ├── model.onnx
│   │   └── run.sh
│   ├── centerface_labels.txt
│   └── config.pbtxt
├── centerface_output.png
├── config
│   ├── centerface.txt
│   ├── out.mp4
│   └── source1_primary_detector.txt
├── customparser
│   ├── customparserbbox_centernet.cpp
│   ├── libnvds_infercustomparser_centernet.so
│   └── Makefile
└── README.md

monjha commented 3 years ago

What system are you using?

ncw16 commented 3 years ago

I'm using a Jetson Xavier NX Developer Kit.

monjha commented 3 years ago

ONNX models with Triton Inference Server are not supported on Jetson.
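For anyone else hitting this on Jetson: since the Triton library shipped with DeepStream for Jetson does support TensorRT engines, a common workaround is to build a TensorRT engine from the ONNX file on the target device and serve that instead. A rough sketch using trtexec; the paths assume the JetPack default install location and this repo's layout, the flags may need extra input-shape options for your model, and the engine must be built on the same device that will run it:

    # Build the engine on the Xavier NX itself, since TensorRT engines
    # are specific to the GPU and TensorRT version they are built with.
    /usr/src/tensorrt/bin/trtexec \
        --onnx=centerface/1/model.onnx \
        --saveEngine=centerface/1/model.plan

    # Then edit centerface/config.pbtxt to use the TensorRT backend:
    #   platform: "tensorrt_plan"

After that change, Triton looks for model.plan (its default filename for tensorrt_plan models) in the version directory instead of model.onnx.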

ncw16 commented 3 years ago

I see, thanks for the help @monjha