google-ai-edge / mediapipe

Cross-platform, customizable ML solutions for live and streaming media.
https://mediapipe.dev
Apache License 2.0

Failed to get LLM params: INVALID_ARGUMENT: LLM model file is null #5310

Closed · Vasanthengineer4949 closed this 4 months ago

Vasanthengineer4949 commented 5 months ago
Common causes for lock verification issues are non-optimized dex code and incorrect proguard optimizations.

2024-04-11 09:37:24.910 8002-8002 es.llminference com...diapipe.examples.llminference W Method boolean androidx.compose.runtime.snapshots.SnapshotStateList.conditionalUpdate$default(androidx.compose.runtime.snapshots.SnapshotStateList, boolean, kotlin.jvm.functions.Function1, int, java.lang.Object) failed lock verification and will run slower.
2024-04-11 09:37:24.910 8002-8002 es.llminference com...diapipe.examples.llminference W Method java.lang.Object androidx.compose.runtime.snapshots.SnapshotStateList.mutate(kotlin.jvm.functions.Function1) failed lock verification and will run slower.
2024-04-11 09:37:24.910 8002-8002 es.llminference com...diapipe.examples.llminference W Method void androidx.compose.runtime.snapshots.SnapshotStateList.update(boolean, kotlin.jvm.functions.Function1) failed lock verification and will run slower.
2024-04-11 09:37:24.910 8002-8002 es.llminference com...diapipe.examples.llminference W Method void androidx.compose.runtime.snapshots.SnapshotStateList.update$default(androidx.compose.runtime.snapshots.SnapshotStateList, boolean, kotlin.jvm.functions.Function1, int, java.lang.Object) failed lock verification and will run slower.
2024-04-11 09:37:25.094 8002-8076 native com...diapipe.examples.llminference A F0000 00:00:1712808445.094404 8076 llm_inference_engine.cc:92] Failed to get LLM params: INVALID_ARGUMENT: LLM model file is null
2024-04-11 09:37:25.094 8002-8076 native com...diapipe.examples.llminference A terminating.
2024-04-11 09:37:25.095 8002-8076 libc com...diapipe.examples.llminference A Fatal signal 6 (SIGABRT), code -1 (SI_QUEUE) in tid 8076 (DefaultDispatch), pid 8002 (es.llminference)

It says my LLM model file is null even though I converted it exactly as described.
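
For context, the native abort fires when the engine cannot open the file behind the configured model path. Here is a minimal sketch of the setup using the MediaPipe GenAI Android API; the model path and the `setMaxTokens` value are assumptions, not the sample's exact code:

```kotlin
import android.content.Context
import com.google.mediapipe.tasks.genai.llminference.LlmInference
import java.io.File

// Assumed location; adjust to wherever the converted model was pushed.
private const val MODEL_PATH = "/data/local/tmp/llm/model.bin"

fun createEngine(context: Context): LlmInference {
    // The "LLM model file is null" abort comes from the native layer failing
    // to open the file, so fail early with a readable message instead.
    check(File(MODEL_PATH).exists()) { "No model file at $MODEL_PATH" }

    val options = LlmInference.LlmInferenceOptions.builder()
        .setModelPath(MODEL_PATH)
        .setMaxTokens(1024)
        .build()
    return LlmInference.createFromOptions(context, options)
}
```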

volcano1216 commented 5 months ago

Encountered the same issue.

kuaashish commented 5 months ago

Hi @Vasanthengineer4949,

Could you please try the suggestions provided by @yuimo? If the issue persists, please provide the following details so we can better understand and attempt to reproduce it:

  1. Detailed steps you are following, referring to the documentation.
  2. Operating System (OS) specifics including version.
  3. Android Studio version in use.

Thank you!!

jeffxchu commented 5 months ago

We ran into a similar error while trying to run the MediaPipe LLM example from Google.

Here is the error message: 29335 llm_inference_engine.cc:92] Failed to get LLM params: INVALID_ARGUMENT: LLM model file is null

@yuimo could you let us know which model you downloaded and replaced?

schmidt-sebastian commented 5 months ago

Which file are you downloading? https://www.kaggle.com/models/google/gemma/tfLite/gemma-2b-it-gpu-int4 should work on Android (among others).
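
A quick way to rule out a truncated or interrupted download before handing the file to the task is a size sanity check. This is a minimal sketch; the threshold below is an arbitrary lower bound, not the model's exact size:

```kotlin
import java.io.File

// Sanity-check the downloaded model before initializing the engine.
// A partial download usually shows up as a missing or suspiciously small
// file, which the native layer then reports as a null model.
fun looksLikeValidModel(path: String): Boolean {
    val model = File(path)
    // gemma-2b-it-gpu-int4 is on the order of a gigabyte, so anything far
    // smaller than that is almost certainly an incomplete download.
    return model.exists() && model.length() > 100L * 1024 * 1024
}
```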

kuaashish commented 4 months ago

Hi @Vasanthengineer4949,

Could you kindly review the previous comment and provide us with the required information?

Thank you!!

github-actions[bot] commented 4 months ago

This issue has been marked stale because it has had no recent activity for 7 days. It will be closed if no further activity occurs. Thank you.

github-actions[bot] commented 4 months ago

This issue was closed due to lack of activity after being marked stale for the past 7 days.

google-ml-butler[bot] commented 4 months ago

Are you satisfied with the resolution of your issue?

yuimo commented 4 months ago

> We ran into a similar error while trying to run the MediaPipe LLM example from Google.
>
> Here is the error message: 29335 llm_inference_engine.cc:92] Failed to get LLM params: INVALID_ARGUMENT: LLM model file is null
>
> @yuimo could you let us know which model you downloaded and replaced?

We use the official model provided by Google: https://www.kaggle.com/models/google/gemma/tfLite/gemma-2b-it-gpu-int4

Our observation: if we re-download and replace the model, it works, but after a few hours or days it stops working again. Sometimes it still fails even after we replace the model. A strange phenomenon; looking forward to your reply.

Fr1z commented 3 months ago

Same problem here, using the Kaggle TFLite binaries. Could it be an Android filesystem restriction?

Fr1z commented 3 months ago

I figured it out: I select the model with rememberLauncherForActivityResult(contract = ActivityResultContracts.OpenDocument()) and then copy it with a FileOutputStream into my application's data directory, obtained from context.applicationContext.filesDir.

On my Android 13 device, the model loads without this error only from that location. Hope it helps.
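
A minimal sketch of that approach for anyone hitting the same wall; the function and composable names are mine, and the MIME type filter is an assumption:

```kotlin
import android.content.Context
import android.net.Uri
import androidx.activity.compose.rememberLauncherForActivityResult
import androidx.activity.result.contract.ActivityResultContracts
import androidx.compose.material3.Button
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.platform.LocalContext
import java.io.File

// Copy the user-picked document into the app's private files directory,
// the one location where (per the comment above) the engine loads the
// model without the "model file is null" abort on Android 13.
fun copyModelToFilesDir(context: Context, uri: Uri): File {
    val dest = File(context.applicationContext.filesDir, "model.bin")
    context.contentResolver.openInputStream(uri)!!.use { input ->
        dest.outputStream().use { output -> input.copyTo(output) }
    }
    return dest
}

@Composable
fun ModelPicker(onModelReady: (File) -> Unit) {
    val context = LocalContext.current
    val launcher = rememberLauncherForActivityResult(
        contract = ActivityResultContracts.OpenDocument()
    ) { uri: Uri? ->
        uri?.let { onModelReady(copyModelToFilesDir(context, it)) }
    }
    // The converted model is an opaque .bin blob, so use a generic MIME type.
    Button(onClick = { launcher.launch(arrayOf("application/octet-stream")) }) {
        Text("Pick model")
    }
}
```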