pytorch / executorch

On-device AI across mobile, embedded and edge for PyTorch
https://pytorch.org/executorch/

Model Deployment using executorch on Android devices with QNN #2953

Open miverco-coder opened 3 months ago

miverco-coder commented 3 months ago

Hi team. Generic license question about deployment.

We have successfully prototyped using QNN to deploy models on Android devices with Snapdragon SoCs. To use the latest QNN SDK we had to ship the Qualcomm libraries (libqnn*.so and all the DSP skel files) alongside the model. The QNN license is pretty broad and requires attribution as well as usage tracking (our lawyer suspected they reused a generic license also applied to codecs). Those license terms are what stopped us from productizing the QNN-based solution.

Now with executorch it seems I can also access the DSPs via QNN, but the BSD license is more permissive (it does not have the tracking requirement). Questions: 1- is that the case? can we use QNN under executorch with only the BSD license? 2- to deploy the model we still need to push all the QNN libs, correct?

Thanks in advance

Miguel

chiwwang commented 1 month ago

Hi @miverco-coder ,

Finally, I can come back to this question... I'm not a lawyer, but I was told that QNN updated its license after version 2.23. If you happen to have time, you might want to check it.

1- is that the case? can we use QNN under executorch with only the BSD license?

No, I'm afraid the QNN license is still in effect.

2- to deploy the model we still need to push all the QNN libs correct?

Yes and no. Pushing all the QNN libraries to the device works, but if you know exactly which device you are targeting, you can push only the corresponding Vxx libraries, e.g., https://github.com/pytorch/executorch/blob/c6d4e8baad5312230b4109296254e6670b9ca4e9/examples/qualcomm/scripts/utils.py#L62
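
For illustration, here is a minimal sketch (separate from the linked utils.py) of pushing only the libraries needed for one known SoC. The SoC-to-Hexagon-arch mapping, library names, and paths below are assumptions based on a typical QNN SDK layout; verify them against your SDK version and target device.

```python
# Hypothetical helper: push only the QNN libraries required for a known SoC,
# instead of copying every Vxx skel from the SDK. Names/paths are illustrative.
import os
import subprocess

QNN_SDK_ROOT = os.environ["QNN_SDK_ROOT"]        # e.g. the unpacked QNN SDK directory
DEVICE_DIR = "/data/local/tmp/executorch_qnn"    # assumed staging directory on device

# Assumed mapping from SoC model to Hexagon architecture version.
SOC_TO_HEXAGON_ARCH = {
    "SM8550": "v73",   # Snapdragon 8 Gen 2 (assumption)
    "SM8650": "v75",   # Snapdragon 8 Gen 3 (assumption)
}

def push_qnn_libs(soc_model: str) -> None:
    arch = SOC_TO_HEXAGON_ARCH[soc_model]
    libs = [
        # CPU-side runtime and stub libraries (aarch64-android):
        f"{QNN_SDK_ROOT}/lib/aarch64-android/libQnnSystem.so",
        f"{QNN_SDK_ROOT}/lib/aarch64-android/libQnnHtp.so",
        f"{QNN_SDK_ROOT}/lib/aarch64-android/libQnnHtp{arch.upper()}Stub.so",
        # DSP skel for the single Hexagon arch we actually target:
        f"{QNN_SDK_ROOT}/lib/hexagon-{arch}/unsigned/libQnnHtp{arch.upper()}Skel.so",
    ]
    subprocess.run(["adb", "shell", "mkdir", "-p", DEVICE_DIR], check=True)
    for lib in libs:
        subprocess.run(["adb", "push", lib, DEVICE_DIR], check=True)

if __name__ == "__main__":
    push_qnn_libs("SM8650")
```

The point is only that the skel libraries are per Hexagon architecture, so a device you know in advance needs just one of them plus the matching stub, not the whole SDK lib tree.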