Closed Weisonorz closed 3 months ago
I'm sorry, I don't know exactly what you mean. Both whisper.cpp and llama.cpp work well on Android phones equipped with a Qualcomm mobile SoC.
updated on 06/14/2024: I think I understand what you meant. There is a simple/direct approach provided in this project.
pls refer to:
This is a simple UT (unit test) of whisper inference in Android command-line mode, which I use to verify the ggml QNN backend on Android phones equipped with a Qualcomm mobile SoC. It also works well in an Android APK (you can verify this with the Android APK built from this project's source code).
For a real AI application/scenario in a commercial Android APK, pls refer to the realtime AI subtitle for online TV in this project, powered by whisper.cpp.
I'd like to close this open issue accordingly. Of course, you can re-open it in the future as needed. Thanks for your understanding.
Thanks for your work. I am new to Qualcomm SoC systems. Just curious: how can we build and use whisper.cpp or llama.cpp separately from this project?
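For building whisper.cpp on its own for an Android phone, a minimal cross-compile sketch might look like the following. This is an assumption-laden example, not this project's official build procedure: it assumes the Android NDK is installed with `$ANDROID_NDK` pointing at its root, uses whisper.cpp's standard CMake build from its upstream repository, and targets a 64-bit ARM device (llama.cpp can be built the same way from its own repository).

```shell
# Sketch only: assumes $ANDROID_NDK points at an installed Android NDK
# and that you build whisper.cpp from its upstream repo, not this project.
git clone https://github.com/ggerganov/whisper.cpp
cd whisper.cpp

# Configure a cross-compile build for a 64-bit ARM Android device
cmake -B build \
  -DCMAKE_TOOLCHAIN_FILE=$ANDROID_NDK/build/cmake/android.toolchain.cmake \
  -DANDROID_ABI=arm64-v8a \
  -DANDROID_PLATFORM=android-28

cmake --build build --config Release

# Push the resulting binary to the phone and run it via adb shell
adb push build/bin/whisper-cli /data/local/tmp/
```

The resulting binary runs in Android command-line mode via `adb shell`, which matches the command-line UT approach described above; integrating it into an APK instead requires wrapping it with JNI.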