google-ai-edge / mediapipe-samples

Apache License 2.0

library "libllm_inference_engine_jni.so" not found - LLM inference for Android Example. #426

Closed sachinsshetty closed 3 weeks ago

sachinsshetty commented 2 months ago

I followed the example at https://ai.google.dev/edge/mediapipe/solutions/genai/llm_inference/android

When running the example on an emulator and on a Samsung device, I get a missing-library error:

```
2024-07-31 00:33:39.382 6923-6923 AndroidRuntime com...diapipe.examples.llminference E
FATAL EXCEPTION: main
Process: com.google.mediapipe.examples.llminference, PID: 6923
java.lang.UnsatisfiedLinkError: dlopen failed: library "libllm_inference_engine_jni.so" not found
    at java.lang.Runtime.loadLibrary0(Runtime.java:1081)
    at java.lang.Runtime.loadLibrary0(Runtime.java:1003)
    at java.lang.System.loadLibrary(System.java:1765)
    at com.google.mediapipe.tasks.genai.llminference.LlmInference.(LlmInference.java:28)
    at com.google.mediapipe.examples.llminference.InferenceModel.(InferenceModel.kt:36)
    at com.google.mediapipe.examples.llminference.InferenceModel.(Unknown Source:0)
    at com.google.mediapipe.examples.llminference.InferenceModel$Companion.getInstance(InferenceModel.kt:51)
    at com.google.mediapipe.examples.llminference.LoadingScreenKt$LoadingRoute$2$1.invokeSuspend(LoadingScreen.kt:42)
    at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
    at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
    at kotlinx.coroutines.internal.LimitedDispatcher$Worker.run(LimitedDispatcher.kt:115)
    at kotlinx.coroutines.scheduling.TaskImpl.run(Tasks.kt:100)
    at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:584)
    at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:793)
    at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:697)
    at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:684)
    Suppressed: kotlinx.coroutines.internal.DiagnosticCoroutineContextException: [androidx.compose.ui.platform.Mo
```
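A possible defensive workaround (my own sketch, not from the sample): since the crash is thrown by `System.loadLibrary` inside the `LlmInference` class initializer, the app can check for a 64-bit ABI up front and surface a readable error instead of crashing. The helper names here are hypothetical, and the assumption that the JNI library ships only for 64-bit ABIs is based on this thread:

```kotlin
import android.content.Context
import android.os.Build
import android.util.Log

// Assumption (per this issue): libllm_inference_engine_jni.so is only
// bundled for 64-bit ABIs, so 32-bit-only (armeabi-v7a) devices crash
// with UnsatisfiedLinkError when LlmInference loads it.
fun isLlmInferenceSupported(): Boolean =
    Build.SUPPORTED_64_BIT_ABIS.isNotEmpty()

// Hypothetical wrapper around the sample's InferenceModel.getInstance
// that converts the native-library crash into a null result the UI can
// handle gracefully.
fun createModelSafely(context: Context): InferenceModel? {
    if (!isLlmInferenceSupported()) {
        Log.e("LlmInference", "No 64-bit ABI on this device; LLM Inference unsupported.")
        return null
    }
    return try {
        InferenceModel.getInstance(context)
    } catch (e: UnsatisfiedLinkError) {
        Log.e("LlmInference", "Native library failed to load", e)
        null
    }
}
```

This doesn't make 32-bit devices work; it only fails cleanly so the loading screen can show an "unsupported device" message instead of a fatal exception.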

shubham0204 commented 2 months ago

@sachinsshetty The issue seems to arise because the MediaPipe LLM Inference API is incompatible with 32-bit ARM (armeabi-v7a) devices. See this issue I created a few days ago.
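One way to confirm this diagnosis on your own setup (my suggestion, not from the thread): compare the ABIs the device reports with the native libraries actually packaged in the APK. The APK path below is a placeholder for your build output:

```shell
# List the ABIs the connected device/emulator supports.
# A 32-bit-only device will show armeabi-v7a (and armeabi) without arm64-v8a.
adb shell getprop ro.product.cpu.abilist

# An APK is a zip archive; list the bundled native libraries.
# If lib/arm64-v8a/libllm_inference_engine_jni.so is present but the device
# list above has no arm64-v8a, the dlopen failure is expected.
unzip -l app/build/outputs/apk/debug/app-debug.apk | grep '\.so$'
```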

PaulTR commented 3 weeks ago

Closing this out as I've put a feature request in for internal tracking around armeabi-v7a support.