johnolafenwa / DeepStack

The World's Leading Cross Platform AI Engine for Edge Devices
Apache License 2.0

Using better detection model on ARM devices #141

Closed — francesco-re-1107 closed this issue 2 years ago

francesco-re-1107 commented 2 years ago

I set up an ARM VM on Oracle Cloud and installed DeepStack. It works great, but I found that on ARM devices DeepStack uses the yolov5s model instead of yolov5m. I know ARM processors are usually found in low-power devices, but my VM is actually quite powerful and has no trouble running even the ExDark model, which uses yolov5x, with good inference times.

Now I was wondering whether it's possible to run DeepStack on ARM with a better object detection model (such as yolov5m). I tried downloading the .pt files from here and using them as custom models, but they aren't recognized. Any help is appreciated!
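For context, DeepStack's custom-model workflow serves a registered model at `/v1/vision/custom/<model-name>`, so this is roughly what a client would look like if the yolov5m weights were accepted. A minimal sketch, assuming a server on localhost port 80 and a model registered as `yolov5m` (the host, port, and model name are placeholders; `parse_labels` is a hypothetical helper, not part of DeepStack):

```python
import requests


def detect(image_path, model="yolov5m", host="http://localhost:80"):
    """POST an image to a DeepStack custom-model endpoint and return its predictions."""
    with open(image_path, "rb") as f:
        response = requests.post(
            f"{host}/v1/vision/custom/{model}",
            files={"image": f},
        )
    # DeepStack responds with JSON containing a "predictions" list of
    # {"label", "confidence", "x_min", "y_min", "x_max", "y_max"} objects.
    return response.json().get("predictions", [])


def parse_labels(predictions, min_confidence=0.5):
    """Keep only labels whose confidence clears the threshold."""
    return [p["label"] for p in predictions if p["confidence"] >= min_confidence]
```

The standard (non-custom) detector would be queried the same way at `/v1/vision/detection`, just without the model name in the path.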

P.S. Thanks for your great work on this project

johnolafenwa commented 2 years ago

Hello @francesco-re-1107, thanks a lot for this feedback. Adding support for this is trivial on our end, and we will soon release a standard ARM64 version of DeepStack with the same models as the standard CPU and GPU versions.

johnolafenwa commented 2 years ago

@francesco-re-1107, as promised, you can find the arm64-server docker image here: https://docs.deepstack.cc/arm64/index.html#deepstack-on-arm64-servers
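For anyone landing here, running the image follows the usual DeepStack Docker pattern; a sketch along these lines (the image tag and host port are assumptions — check the linked page for the exact tag to use):

```shell
# Pull the arm64 DeepStack server image (tag is an assumption; see the linked docs)
docker pull deepquestai/deepstack:arm64

# Run it detached: VISION-DETECTION=True enables the object detection API,
# a named volume persists activation/model data, and the service listens
# on container port 5000 (mapped here to host port 80).
docker run -d \
  -e VISION-DETECTION=True \
  -v localstorage:/datastore \
  -p 80:5000 \
  deepquestai/deepstack:arm64
```

Once up, the detection endpoint is reachable at `http://localhost:80/v1/vision/detection`.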