kumard3 opened this issue 5 months ago
That file path needs to point to a location on your device.
For Android, see https://ai.google.dev/edge/mediapipe/solutions/genai/llm_inference/android#push_model_to_the_device
For iOS, you need to find a way to push the file into the app's local storage, or perhaps an iCloud location if you are familiar with how that works, or potentially use the iOS platform APIs for accessing files.
The easiest way, on both platforms, is simply to bundle the model as an asset.
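For concreteness, here is a sketch of the two setups. The prop names come from the snippets posted later in this thread; the paths and filenames are only examples, so adjust them to your own project:

```javascript
// 1. Model bundled as an asset (on Android: android/app/src/main/assets/)
const bundled = useLlmInference({
  storageType: 'asset',
  modelName: 'gemma-2b-it-cpu-int4.bin',
});

// 2. Model already present as a file on the device
const fromFile = useLlmInference({
  storageType: 'file',
  modelPath: '/data/local/tmp/llm/gemma-2b-it-cpu-int4.bin',
});
```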
Thank you, I will try that. Okay, I have two more questions:
- What is the asset type?
- The .bin model is the right one, right?
Yes and yes.
Getting this error: createModel [Error: internal: Failed to initialize session: %sCan not open OpenCL library on this device - undefined symbol: clSetPerfHintQCOM]
I have tried it on a physical Pixel 7 and an emulated Pixel 8 Pro.
Did you ever figure this out? I'm also having a difficult time figuring out how to bundle the model as an asset, or how to access it via a file path on the device.
To bundle, I downloaded a model and included it in an assets folder: I tried putting it in android/app/src/main/assets, in a models/converted folder, and in the same folder as the file calling the function.
With a storageType of 'file', I put the model on my device and tried accessing it with /data/local/tmp/llm/gemma-2b-it-cpu-int4.bin, as well as moving it to other locations and trying different variations of the file path.
I'm most interested in getting it working as a bundled asset, though. Any help/pointers appreciated. Thanks.
Hey guys, I just pushed a new PR attempting to fix the OpenCL library problem; it also adds comments about where to put the model files to make this work. I hope it fixes your problems and clears up your doubts 😄
@cdiddy77 I'm setting up my llmInference like this:

```javascript
const llmInference = useLlmInference({
  storageType: 'file',
  modelPath: '/data/user/0/com.offlinellmpoc/files/gemma-2b-it-cpu-int4.bin',
});
```

But my app crashes, and I'm not able to work out what the issue is.
I am relatively new to React Native and mobile app development. How does one bundle the model as an asset? According to the Google docs:

> Note: During development, you can use adb to push the model to your test device for a simpler workflow. For deployment, host the model on a server and download it at runtime. The model is too large to be bundled in an APK.

I agree that bundling it as an asset is best, but I do not know how to do it. Can you show me how?
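For the download-at-runtime route that note mentions, a rough sketch, not an official recipe: it assumes the react-native-fs package is installed, and the URL and filename are placeholders.

```javascript
import RNFS from 'react-native-fs';

// Download the model into the app's document directory on first launch.
// The URL is a placeholder -- you would host the .bin file yourself.
const modelFile = `${RNFS.DocumentDirectoryPath}/gemma-2b-it-cpu-int4.bin`;

async function ensureModelDownloaded() {
  if (!(await RNFS.exists(modelFile))) {
    await RNFS.downloadFile({
      fromUrl: 'https://example.com/models/gemma-2b-it-cpu-int4.bin',
      toFile: modelFile,
    }).promise;
  }
  return modelFile; // pass this as modelPath with storageType: 'file'
}
```

RNFS.DocumentDirectoryPath resolves to the app-internal files directory (the /data/user/0/&lt;package&gt;/files path seen earlier in this thread), which the app process can read.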
@luey-punch I was able to do that by creating a folder named 'assets' inside android/app/src/main and moving the model inside it. Although it worked, my Android device was lagging a lot.
```javascript
const { generateResponse } = useLlmInference({
  storageType: 'asset',
  modelName: 'gemma.bin',
});
```
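A hedged usage sketch for the hook's return value; the exact signature of generateResponse isn't shown in this thread, so treat the argument list as an assumption and check the package README:

```javascript
// Assumes generateResponse takes the prompt string and resolves with
// the model's full response; verify against the package README.
const reply = await generateResponse('Write a haiku about autumn.');
console.log(reply);
```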
Hi, I'm getting this error while using the package. I might be using it wrong; can you help me with it?