In my understanding, not all Android devices support OpenCL; for example, the Google Pixel does not.
I think a better solution is to use Vulkan, which llama.cpp should support soon. That said, Vulkan support for integrated/mobile GPUs is not quite there yet (the Vulkan features available on Android devices are usually limited).
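To illustrate the device-dependence mentioned above, a common way to probe for an OpenCL driver on Android at runtime is to attempt to load the vendor's `libOpenCL.so`. This is only a minimal sketch, not code from llama.rn or llama.cpp, and the library paths checked are just typical vendor locations:

```cpp
// Minimal sketch: probe for an OpenCL driver on Android by trying to load
// the vendor's libOpenCL.so. Devices that do not ship an OpenCL driver
// (e.g. Google Pixel phones) will fail every probe, which is why an OpenCL
// backend cannot be assumed to exist on all devices.
#include <dlfcn.h>
#include <cstdio>

static bool has_opencl_driver() {
    // Typical locations for vendor OpenCL drivers; illustrative, not exhaustive.
    const char *candidates[] = {
        "libOpenCL.so",
        "/system/vendor/lib64/libOpenCL.so",
        "/system/lib64/libOpenCL.so",
    };
    for (const char *path : candidates) {
        void *handle = dlopen(path, RTLD_NOW | RTLD_LOCAL);
        if (handle != nullptr) {
            dlclose(handle);
            return true;  // a driver is present; an OpenCL backend could in principle be used
        }
    }
    return false;  // no driver found on this device
}

int main() {
    std::printf("OpenCL driver %s\n", has_opencl_driver() ? "found" : "not found");
    return 0;
}
```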
Though I would prefer a mature solution in OpenCL, Vulkan does seem a bit more future-proof. Perhaps this is simply impatience on my end to get improved prompt processing, but Vulkan still seems fairly far out. As it stands, some users of my project would also like to see improved processing on Android.
Granted, I am not the repo maintainer, so your perspective on implementation is far more valid. I suppose I will continue working on a forked version to get OpenCL functional. Thanks for the response.
Vulkan is now merged; is it possible to bring it to Android as an option?
Likely not yet; I've heard that many mobile GPUs are still not functional with it.
I will be closing this issue, as it's now simply a matter of waiting until Vulkan's mobile implementation is done.
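For anyone experimenting in the meantime, here is a hedged sketch of how an app could check at runtime whether a Vulkan driver and at least one physical device are present before attempting to enable a Vulkan-backed build on Android. It uses only the standard Vulkan loader API and is not part of llama.cpp's backend; a device showing up here does not guarantee the backend actually works on it.

```cpp
// Minimal sketch (assumes the Vulkan loader and headers are available, e.g. via the NDK):
// create an instance and list physical devices so an app can decide whether
// enabling a Vulkan backend on this Android device is even plausible.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app_info{};
    app_info.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app_info.pApplicationName = "vulkan-probe";
    app_info.apiVersion = VK_API_VERSION_1_1;

    VkInstanceCreateInfo create_info{};
    create_info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    create_info.pApplicationInfo = &app_info;

    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&create_info, nullptr, &instance) != VK_SUCCESS) {
        std::printf("No usable Vulkan loader/driver on this device\n");
        return 1;
    }

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);
    std::vector<VkPhysicalDevice> devices(count);
    vkEnumeratePhysicalDevices(instance, &count, devices.data());

    for (VkPhysicalDevice dev : devices) {
        VkPhysicalDeviceProperties props{};
        vkGetPhysicalDeviceProperties(dev, &props);
        // Mobile drivers vary widely in feature support, so listing a device
        // here is only a first sanity check, not a guarantee of compatibility.
        std::printf("GPU: %s (Vulkan %u.%u)\n", props.deviceName,
                    VK_VERSION_MAJOR(props.apiVersion),
                    VK_VERSION_MINOR(props.apiVersion));
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```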
First of all, thanks for the hard work on bringing this project to the react-native ecosystem.
I have been using llama.rn for a few weeks now in my personal project: https://github.com/Vali-98/ChatterUI
I was wondering if there is any interest in implementing OpenCL for Android. I have attempted to work on it myself with little success, given my inexperience with native modules.