larq / compute-engine

Highly optimized inference engine for Binarized Neural Networks
https://docs.larq.dev/compute-engine
Apache License 2.0

Deployment on Cortex-M #688

Closed mengna0707 closed 2 years ago

mengna0707 commented 3 years ago

Hi, I'm in a bit of a hurry, so I'm asking here: who should I contact for details about using the deployment tool on the ARM Cortex-M family of MCUs?

CNugteren commented 3 years ago

Hello mengna0707. Thank you for your interest in the Larq Compute Engine.

The open-source version of the Larq Compute Engine does not support deployment on Cortex-M. If you are interested in inference on Cortex-M, please contact our sales team at hello@plumerai.com and we'll get in touch with you. In your email, please provide relevant details such as the kinds of neural networks you plan to use, the exact target board(s), and an estimate of the number of devices you plan to deploy to.

mengna0707 commented 3 years ago

Hi CNugteren, thank you very much for the contact information. I have sent an email to inquire about the relevant details and look forward to receiving a reply.