johnolafenwa / DeepStack

The World's Leading Cross Platform AI Engine for Edge Devices
Apache License 2.0

deepstack-lite? #12

Closed robmarkcole closed 3 years ago

robmarkcole commented 4 years ago

I am not expecting this to be a high priority, but I mentioned I wanted fast inference on an RPi without requiring a USB stick for acceleration. Well, I achieved this using tensorflow-lite in the repo below. My thought was this could be rolled into deepstack or a branch at some point, possibly as a deepstack:lite container or similar. Any thoughts?

https://github.com/robmarkcole/tensorflow-lite-rest-server
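For context, a server like the one linked wraps a TFLite interpreter behind HTTP endpoints and returns detections as JSON. A hedged sketch of the post-processing step is below: it assumes the common SSD-MobileNet-style output (normalized boxes, class indices, confidence scores); `format_detections`, the label list, and the response schema are illustrative, not taken from the linked repo or from DeepStack.

```python
# Hypothetical post-processing for a TFLite object-detection REST server.
# In a real server this would run after interpreter.invoke(), on the raw
# output tensors; here plain lists stand in for those tensors.

def format_detections(boxes, classes, scores, labels, threshold=0.5):
    """Convert raw detection arrays into a JSON-serializable payload.

    boxes   : list of [ymin, xmin, ymax, xmax], normalized to [0, 1]
    classes : list of class indices into `labels`
    scores  : list of confidence scores in [0, 1]
    """
    detections = []
    for box, cls, score in zip(boxes, classes, scores):
        if score < threshold:
            continue  # drop low-confidence detections
        ymin, xmin, ymax, xmax = box
        detections.append({
            "label": labels[int(cls)],
            "confidence": round(float(score), 3),
            "box": {"ymin": ymin, "xmin": xmin, "ymax": ymax, "xmax": xmax},
        })
    return {"predictions": detections}

if __name__ == "__main__":
    # Two raw detections; only the first clears the 0.5 threshold.
    labels = ["person", "car", "dog"]
    boxes = [[0.1, 0.2, 0.5, 0.6], [0.0, 0.0, 0.2, 0.2]]
    classes = [0, 2]
    scores = [0.91, 0.34]
    print(format_detections(boxes, classes, scores, labels))
```

Keeping this step as a pure function makes the HTTP layer (Flask, FastAPI, or stdlib `http.server`) a thin wrapper around the interpreter, which is part of why a tflite-based server stays lightweight enough for an RPi.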

johnolafenwa commented 4 years ago

This is cool. How fast is it, and what is the accuracy of the detections like?

robmarkcole commented 4 years ago

Speed will clearly depend on the platform, and I haven't done any benchmarking, but the response is instantaneous on a Mac. Re: accuracy, there is a chart in this link which indicates accuracy around 75-80%.

johnolafenwa commented 4 years ago

Okay, cool. Will investigate this further, especially as the speed on the Pi is often much slower compared to a Mac.

robmarkcole commented 3 years ago

I think this is best served by a separate project; we don't want to maintain a fork. Closing.