jkjung-avt / tensorrt_demos

TensorRT MODNet, YOLOv4, YOLOv3, SSD, MTCNN, and GoogLeNet
https://jkjung-avt.github.io/
MIT License
1.75k stars · 547 forks

Stream Inference Output #502

Closed · seabass1217 closed this issue 2 years ago

seabass1217 commented 2 years ago

Hi,

I would like to stream the inference output so users can access it via a web browser, similar to the AlexeyAB/darknet project (https://github.com/AlexeyAB/darknet#how-to-use-on-the-command-line), where users can access predictions in JSON format via a web browser.

Is there a simple Python HTTP/JSON server (or something of the like) that allows inference output to be streamed and accessed with a browser? The stream can be in any format: JSON, XML, or whatever.

Any help or suggested solutions are greatly appreciated.

Thanks!

jkjung-avt commented 2 years ago

Yes, I have done something similar in this repo. The trt_yolo_mjpeg.py script uses an HTTP server to stream MJPEG images to a remote client.

You could refer to the HTTP server implementation in utils/mjpeg.py. It should not be too difficult to extend the MjpegServer and MjpegHandler classes to support GET of JSON detection results.
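To illustrate the suggestion above, here is a minimal standalone sketch of serving the latest detection results as JSON over HTTP. It does not reuse the repo's MjpegServer/MjpegHandler classes; it is built only on the Python standard library, and the `publish` helper and the `"boxes"`/`"confs"`/`"clss"` field names are assumptions for this example, not the repo's actual API.

```python
# Minimal sketch: serve the latest detections as JSON at GET /detections.
# NOTE: publish() and the boxes/confs/clss field names are assumptions for
# this example; adapt them to whatever your inference loop produces.
import json
import threading
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

_lock = threading.Lock()
_latest = {"boxes": [], "confs": [], "clss": []}  # most recent results


def publish(boxes, confs, clss):
    """Call this from the inference loop after each processed frame."""
    global _latest
    with _lock:
        _latest = {"boxes": boxes, "confs": confs, "clss": clss}


class JsonHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/detections":
            with _lock:
                body = json.dumps(_latest).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, fmt, *args):
        # Silence per-request console logging.
        pass


def start_server(port=8080):
    """Start the JSON server on a background thread and return it."""
    server = ThreadingHTTPServer(("0.0.0.0", port), JsonHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

With the server running alongside the detection loop, a browser (or `curl http://<host>:8080/detections`) would fetch the most recent frame's results as JSON. The same pattern could instead be folded into the existing handler in utils/mjpeg.py by branching on `self.path` inside its GET handling.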

seabass1217 commented 2 years ago

@jkjung-avt,

Thank you for your feedback! I'm fairly new to HTTP/MJPEG servers but will research your suggestion. Any reference implementation would be helpful.

Again thank you and I will close this issue.