robmarkcole / coral-pi-rest-server

Perform inference with TensorFlow Lite models on an RPi, with acceleration from a Coral USB stick
https://coral.ai/products/accelerator
MIT License

Add docker container #53

Open robmarkcole opened 4 years ago

robmarkcole commented 4 years ago

https://blogs.sap.com/2020/02/11/containerizing-a-tensorflow-lite-edge-tpu-ml-application-with-hardware-access-on-raspbian/

yayitazale commented 2 years ago

I would love to use this in a docker container to classify birds detected by Frigate.

sstratoti commented 1 year ago

I'm working on swapping my DeepStack endpoints for a Coral endpoint.

I've got things working in Docker, and with a one-to-one camera-to-Coral setup it works fine. But if I connect more than one camera and process more than one image at a time, it returns a 500 error. That makes sense, I think, because the script isn't designed to handle more than one request at a time.

Any thoughts on adding multiprocessing or a queue of some sort to the Flask configuration? I've been looking into it, but the Python is a bit over my head. :/

robmarkcole commented 1 year ago

An approach like this: https://github.com/robmarkcole/ServingMLFastCelery. You can also see how DeepStack does it.
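The queue idea mentioned above can be sketched with the standard library alone: push each incoming request onto a `queue.Queue` and let a single worker thread drain it, so concurrent Flask handlers never touch the one Coral USB stick at the same time. This is only a sketch of the pattern, not code from this repo; `run_inference` is a hypothetical stand-in for the real Edge TPU interpreter call, and `submit` is what each Flask route handler would invoke.

```python
import queue
import threading

def run_inference(image_bytes):
    # Hypothetical placeholder for the actual tflite / Edge TPU model call.
    return {"size": len(image_bytes), "label": "bird"}

# Jobs are (image_bytes, reply_dict) pairs; one worker serializes them.
_jobs = queue.Queue()

def _worker():
    while True:
        image_bytes, reply = _jobs.get()
        try:
            reply["result"] = run_inference(image_bytes)
        except Exception as exc:  # surface model errors to the caller
            reply["error"] = str(exc)
        finally:
            reply["done"].set()
            _jobs.task_done()

# Daemon thread: dies with the process, never blocks shutdown.
threading.Thread(target=_worker, daemon=True).start()

def submit(image_bytes, timeout=30):
    """Call from each Flask request handler; blocks until this job's turn."""
    reply = {"done": threading.Event()}
    _jobs.put((image_bytes, reply))
    if not reply["done"].wait(timeout):
        return {"error": "timeout"}
    return reply.get("result", {"error": reply.get("error", "unknown")})
```

A route would then just do `return jsonify(submit(request.files["image"].read()))`, and every request gets a result (or a timeout) instead of a 500, regardless of how many cameras post at once. For heavier traffic, a broker-backed queue like the Celery setup linked above scales the same idea across processes.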