deepstreamIO / deepstream.io

deepstream.io server
https://deepstreamio.github.io
MIT License

NVidia DeepStream Output Inference Class Mismatch - “Vehicle Class” #1097

Closed crazyoutlook closed 3 years ago

crazyoutlook commented 3 years ago

• Hardware Platform (Jetson / GPU) Jetson Nano 4GB, Ubuntu 18.04

• DeepStream Version marketplace.azurecr.io/nvidia/deepstream-iot2-l4t:latest

• JetPack Version 4.3

• Issue Type Output inference class is different from Model class

• How to reproduce the issue? On DeepStream, deploy an object detection ONNX model. My model is an ONNX model exported from Azure Custom Vision. My label file has two classes, 'Mask' and 'No_Mask'. Deployment works fine and I am able to run my model using DeepStream. However, the output inference classes I am getting are 'Vehicle' and 'No_Mask'. Can you please help me understand why I am getting the output inference label "Vehicle" when that class is not in my model?
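For context, class-name strings in NVIDIA DeepStream's `nvinfer` element come from the label file referenced in its config, not from the ONNX model itself. A minimal, hypothetical config fragment (placeholder file names; the actual config used in this deployment is not shown in the issue) illustrates where that wiring happens; if `labelfile-path` still points at a default or stale label file, the reported names will not match the deployed model's classes:

```
# Hypothetical nvinfer primary-inference config fragment.
# Paths below are placeholders, not the files from this deployment.
[property]
onnx-file=model.onnx
labelfile-path=labels.txt
num-detected-classes=2
```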

Sample output inference log {"log":" "1|324|23|380|61|Vehicle|#|||||||0"\n","stream":"stdout","time":"2021-01-05T16:15:15.614591738Z"}

{"log":" "1|324|23|380|61|Vehicle|#|||||||0"\n","stream":"stdout","time":"2021-01-05T16:15:15.614790179Z"}

{"log":" "2|141|15|365|161|No Mask"\n","stream":"stdout","time":"2021-01-05T16:15:15.614221209Z"}
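To check which labels actually appear in such container logs, the pipe-delimited payload inside each JSON log entry can be parsed. A minimal sketch, using the sample entries above (with JSON string escaping restored) and a field layout inferred from them, where the sixth pipe-delimited field carries the class label:

```python
import json

# Sample log entries copied from the issue above, with the inner quotes
# escaped so each line is valid JSON.
entries = [
    '{"log":" \\"1|324|23|380|61|Vehicle|#|||||||0\\"\\n","stream":"stdout","time":"2021-01-05T16:15:15.614591738Z"}',
    '{"log":" \\"2|141|15|365|161|No Mask\\"\\n","stream":"stdout","time":"2021-01-05T16:15:15.614221209Z"}',
]

def label_of(entry: str) -> str:
    """Extract the class label: the 6th pipe-delimited field of the payload."""
    payload = json.loads(entry)["log"].strip().strip('"')
    return payload.split("|")[5]

for e in entries:
    print(label_of(e))  # prints "Vehicle", then "No Mask"
```

Running this over the full log stream would show every distinct label the pipeline emits, which makes the unexpected "Vehicle" class easy to confirm.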

slachtar commented 3 years ago

I think you posted in the wrong place. This is the deepstream.io realtime server, not NVIDIA DeepStream.
