microsoft / azure-percept-advanced-development

Azure Percept DK advanced topics

Add an inference script to the deployed model / access the video frames #51

Closed mouhannadali closed 3 years ago

mouhannadali commented 3 years ago

Hi, is there a way to add an inference script (like when deploying a model in Azure ML) to do pre-/post-processing? If not, is there a way to access the frames of the camera video stream?

What I want to achieve: use a pre-trained model deployed in a container to detect an object in an image frame, then do some post-processing and send the result, together with the image frame, to another deployed container.

MaxStrange commented 3 years ago

The azureeyemodule is responsible for inferencing on the video stream. You can create another container that listens to its messages.

You can follow the tutorials in this repo for figuring out how to add your own custom model, or you can use one of the pretrained ones we have. Unfortunately, we don't seem to have gotten around to documenting the message schema (as far as I can recall). But you can peruse the source code of the pretrained models to see what it looks like: https://github.com/microsoft/azure-percept-advanced-development/tree/main/azureeyemodule/app/model. Alternatively, just deploy the model of interest and view its IoT messages using something like VS Code's Azure IoT Edge extension.
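Since the message schema is undocumented, here is a minimal sketch of what a consuming container's parsing logic might look like. The `NEURAL_NETWORK` key and the field names (`bbox`, `label`, `confidence`) are assumptions based on the pretrained-model source linked above; verify them against the actual IoT messages your model emits before relying on them.

```python
import json

# Hypothetical example payload from the azureeyemodule. The real schema is
# undocumented, so treat the "NEURAL_NETWORK" key and the field names here
# as assumptions to check against the model source or live IoT messages.
SAMPLE_MESSAGE = json.dumps({
    "NEURAL_NETWORK": [
        {"bbox": [0.1, 0.2, 0.4, 0.5], "label": "person", "confidence": "0.87"},
        {"bbox": [0.6, 0.6, 0.7, 0.8], "label": "cat", "confidence": "0.31"},
    ]
})

def parse_detections(raw: str, min_confidence: float = 0.5):
    """Extract (label, bbox) pairs above a confidence threshold."""
    detections = json.loads(raw).get("NEURAL_NETWORK", [])
    return [
        (d["label"], d["bbox"])
        for d in detections
        if float(d["confidence"]) >= min_confidence
    ]

print(parse_detections(SAMPLE_MESSAGE))
# → [('person', [0.1, 0.2, 0.4, 0.5])]
```

In a real deployment this function would sit inside a module that receives routed messages (e.g. via the Azure IoT Edge module client) rather than a hard-coded sample string.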

mouhannadali commented 3 years ago

Thanks @MaxStrange for the reply. In case I create another container, is there an easy way to listen to the same camera video stream?

MaxStrange commented 3 years ago

Yes, we output a raw stream (i.e., no inference mark-ups like bounding boxes) encoded as MJPEG, a raw stream encoded as H.264, and a stream with inference mark-ups encoded as MJPEG.

These are RTSP streams and are found at rtsp://0.0.0.0:8554/raw. Replace the /raw with /result for the inference stream, or /h264raw for the H.264-encoded stream. Please note that the H.264 stream is currently broken (it crashes after about 7 minutes of streaming); there is a pull request in to fix it, and it should be fixed as part of the next release.
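As a sketch of consuming one of those streams from another container, the endpoint paths above can be opened with OpenCV's `cv2.VideoCapture`, which accepts RTSP URLs. Note that the reachable host depends on your deployment (the `0.0.0.0` address is what the module binds on-device; from a sibling container you would likely use the module's hostname instead, which is an assumption to verify):

```python
# Stream endpoints taken from the reply above; the host is deployment-specific.
RTSP_BASE = "rtsp://0.0.0.0:8554"
STREAMS = {"raw": "/raw", "result": "/result", "h264": "/h264raw"}

def stream_url(kind: str) -> str:
    """Build the RTSP URL for one of the known stream endpoints."""
    return RTSP_BASE + STREAMS[kind]

def read_frames(kind: str = "raw"):
    """Read frames from one of the Percept DK RTSP streams with OpenCV."""
    import cv2  # pip install opencv-python

    cap = cv2.VideoCapture(stream_url(kind))
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # ... per-frame post-processing goes here ...
    cap.release()
```

Given the H.264 caveat above, the MJPEG `raw` or `result` streams are the safer choice until the fix ships.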

MaxStrange commented 3 years ago

Closing due to inactivity. Please open again if you still need help.