Unfortunately I don't have the means to properly update and test cui-llama.rn on iOS, as I lack any Apple devices.
There is a PR on the original llama.rn with iOS synced with the latest llama.cpp:
https://github.com/mybigday/llama.rn/pull/79
However, it lacks the new features I added in cui-llama.rn, such as querying CPU features, synchronous tokenization, and the progress callback.
I'd like to bring new features from cui-llama.rn to iOS. Could you give me some ideas on where to start?
First, it might be wise to familiarize yourself with native modules in React Native:
https://reactnative.dev/docs/0.72/native-modules-ios
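Very roughly, an iOS native module looks like the sketch below. The class and method names are made up for illustration and are not part of llama.rn; it just shows the standard RCT_EXPORT_MODULE / RCT_EXPORT_METHOD pattern from the React Native docs.

```objc
// MinimalModule.mm -- minimal sketch of a React Native iOS native module.
// Hypothetical names; not part of llama.rn.
#import <React/RCTBridgeModule.h>

@interface MinimalModule : NSObject <RCTBridgeModule>
@end

@implementation MinimalModule

// Makes the class visible to JS as NativeModules.MinimalModule.
RCT_EXPORT_MODULE();

// An async method callable from JS; resolves a promise with a dictionary.
RCT_EXPORT_METHOD(getInfo:(RCTPromiseResolveBlock)resolve
                  reject:(RCTPromiseRejectBlock)reject)
{
  resolve(@{ @"platform" : @"ios" });
}

@end
```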
You probably want to look at the adapter between the C++ code and the Objective-C++ bridge, such as this: https://github.com/mybigday/llama.rn/blob/3ee23e4f69c6ce67249ceb3f5e8551730799214c/ios/RNLlamaContext.mm
I think if that .mm file were updated with the new fields, it would work; a sketch of the patterns involved follows below.
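To give a concrete feel for the kind of additions involved, here is a hedged sketch of the two bridging patterns the new features would need: a blocking synchronous method (for sync tokenize) and an event-based progress callback. The class, method, and event names here are assumptions for illustration only and do not match cui-llama.rn's actual JS API; the real work is wiring them to the same C++ calls the Android side already uses.

```objc
// LlamaBridgeSketch.mm -- hypothetical additions, not cui-llama.rn's real API.
#import <React/RCTBridgeModule.h>
#import <React/RCTEventEmitter.h>

@interface LlamaBridgeSketch : RCTEventEmitter
@end

@implementation LlamaBridgeSketch

RCT_EXPORT_MODULE();

// Events this module can emit back to JS (required by RCTEventEmitter).
- (NSArray<NSString *> *)supportedEvents
{
  return @[ @"onLoadProgress" ];
}

// Synchronous method: returns directly to JS and blocks the JS thread,
// so it should only wrap cheap calls like tokenization.
RCT_EXPORT_BLOCKING_SYNCHRONOUS_METHOD(tokenizeSync:(NSString *)text)
{
  NSMutableArray<NSNumber *> *tokens = [NSMutableArray array];
  // Call into the C++ tokenizer here and append the token ids as NSNumbers.
  return tokens;
}

// Async model load that reports progress to JS as events while it runs.
RCT_EXPORT_METHOD(loadModel:(NSString *)path
                  resolve:(RCTPromiseResolveBlock)resolve
                  reject:(RCTPromiseRejectBlock)reject)
{
  dispatch_async(dispatch_get_global_queue(QOS_CLASS_DEFAULT, 0), ^{
    // In the real module, hook the native load-progress callback and
    // forward each update as an event instead of this single placeholder.
    [self sendEventWithName:@"onLoadProgress" body:@{ @"progress" : @(1.0) }];
    resolve(@{ @"loaded" : @YES });
  });
}

@end
```

On the JS side these would be consumed the same way as on Android, via NativeModules and a NativeEventEmitter listener.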
Also, you will need to re-run pod install as described in the llama.rn README.
I am not an iOS dev, so I can't really help any further. I will be closing the issue, but feel free to continue asking questions.
Building cui-llama.rn seems to be broken on iOS/macOS. Since cui-llama.rn doesn't have an issues section, I'll leave it here.
The error: