google-ai-edge / mediapipe-samples


Cannot generate text with gemma-2b model #446

Closed madroidmaq closed 1 week ago

madroidmaq commented 2 months ago

Inference fails on a Google Pixel 4 device; the error log is as follows:

17:04:57.271 Remote...onImpl  W  requestCursorAnchorInfo on inactive InputConnection
17:04:57.452 OnBack...llback  W  OnBackInvokedCallback is not enabled for the application.
                                 Set 'android:enableOnBackInvokedCallback="true"' in the application manifest.
17:04:57.483 Insets...roller  D  show(ime(), fromIme=true)
17:05:01.814 Remote...onImpl  W  requestCursorAnchorInfo on inactive InputConnection
17:05:01.823 Remote...onImpl  W  requestCursorAnchorInfo on inactive InputConnection
17:05:01.857 Remote...onImpl  W  requestCursorAnchorInfo on inactive InputConnection
17:05:01.900 Remote...onImpl  W  getSurroundingText on inactive InputConnection
17:05:02.021 libc             A  Fatal signal 11 (SIGSEGV), code 2 (SEGV_ACCERR), fault addr 0x777a7e4000 in tid 6633 (mediapipe/6633), pid 6596 (es.llminference)
17:05:02.047 Remote...onImpl  W  requestCursorAnchorInfo on inactive InputConnection
17:05:02.062 Remote...onImpl  W  requestCursorAnchorInfo on inactive InputConnection
17:05:02.068 Insets...roller  D  show(ime(), fromIme=true)
17:05:02.118 Remote...onImpl  W  requestCursorAnchorInfo on inactive InputConnection
---------------------------- PROCESS STARTED (6666) for package com.google.mediapipe.examples.llminference ----------------------------
17:05:02.369 DEBUG            A  Cmdline: com.google.mediapipe.examples.llminference
17:05:02.369 DEBUG            A  pid: 6596, tid: 6633, name: mediapipe/6633  >>> com.google.mediapipe.examples.llminference <<<
17:05:02.369 DEBUG            A        #00 pc 00000000003e1614  /data/app/~~KDGLoEYpcCDWDrQ8azrhyg==/com.google.mediapipe.examples.llminference-0NHsZ8brHDRoxg0ltksmYg==/base.apk!libllm_inference_engine_jni.so
17:05:02.369 DEBUG            A        #01 pc 000000000037f650  /data/app/~~KDGLoEYpcCDWDrQ8azrhyg==/com.google.mediapipe.examples.llminference-0NHsZ8brHDRoxg0ltksmYg==/base.apk!libllm_inference_engine_jni.so
17:05:02.369 DEBUG            A        #02 pc 00000000003ff028  /data/app/~~KDGLoEYpcCDWDrQ8azrhyg==/com.google.mediapipe.examples.llminference-0NHsZ8brHDRoxg0ltksmYg==/base.apk!libllm_inference_engine_jni.so
17:05:02.369 DEBUG            A        #03 pc 00000000004025b8  /data/app/~~KDGLoEYpcCDWDrQ8azrhyg==/com.google.mediapipe.examples.llminference-0NHsZ8brHDRoxg0ltksmYg==/base.apk!libllm_inference_engine_jni.so
17:05:02.369 DEBUG            A        #04 pc 00000000003fedac  /data/app/~~KDGLoEYpcCDWDrQ8azrhyg==/com.google.mediapipe.examples.llminference-0NHsZ8brHDRoxg0ltksmYg==/base.apk!libllm_inference_engine_jni.so
17:05:02.369 DEBUG            A        #05 pc 00000000003800d4  /data/app/~~KDGLoEYpcCDWDrQ8azrhyg==/com.google.mediapipe.examples.llminference-0NHsZ8brHDRoxg0ltksmYg==/base.apk!libllm_inference_engine_jni.so
17:05:02.369 DEBUG            A        #06 pc 000000000036a868  /data/app/~~KDGLoEYpcCDWDrQ8azrhyg==/com.google.mediapipe.examples.llminference-0NHsZ8brHDRoxg0ltksmYg==/base.apk!libllm_inference_engine_jni.so
17:05:02.369 DEBUG            A        #07 pc 00000000000e088c  /data/app/~~KDGLoEYpcCDWDrQ8azrhyg==/com.google.mediapipe.examples.llminference-0NHsZ8brHDRoxg0ltksmYg==/base.apk!libllm_inference_engine_jni.so
17:05:02.369 DEBUG            A        #08 pc 00000000000cdb40  /data/app/~~KDGLoEYpcCDWDrQ8azrhyg==/com.google.mediapipe.examples.llminference-0NHsZ8brHDRoxg0ltksmYg==/base.apk!libllm_inference_engine_jni.so
17:05:02.369 DEBUG            A        #09 pc 00000000000bd038  /data/app/~~KDGLoEYpcCDWDrQ8azrhyg==/com.google.mediapipe.examples.llminference-0NHsZ8brHDRoxg0ltksmYg==/base.apk!libllm_inference_engine_jni.so
17:05:02.369 DEBUG            A        #10 pc 0000000000458408  /data/app/~~KDGLoEYpcCDWDrQ8azrhyg==/com.google.mediapipe.examples.llminference-0NHsZ8brHDRoxg0ltksmYg==/base.apk!libllm_inference_engine_jni.so
17:05:02.369 DEBUG            A        #11 pc 000000000043fd7c  /data/app/~~KDGLoEYpcCDWDrQ8azrhyg==/com.google.mediapipe.examples.llminference-0NHsZ8brHDRoxg0ltksmYg==/base.apk!libllm_inference_engine_jni.so
17:05:02.369 DEBUG            A        #12 pc 000000000043fa48  /data/app/~~KDGLoEYpcCDWDrQ8azrhyg==/com.google.mediapipe.examples.llminference-0NHsZ8brHDRoxg0ltksmYg==/base.apk!libllm_inference_engine_jni.so
17:05:02.369 DEBUG            A        #13 pc 00000000004544d0  /data/app/~~KDGLoEYpcCDWDrQ8azrhyg==/com.google.mediapipe.examples.llminference-0NHsZ8brHDRoxg0ltksmYg==/base.apk!libllm_inference_engine_jni.so
17:05:02.369 DEBUG            A        #14 pc 0000000000454074  /data/app/~~KDGLoEYpcCDWDrQ8azrhyg==/com.google.mediapipe.examples.llminference-0NHsZ8brHDRoxg0ltksmYg==/base.apk!libllm_inference_engine_jni.so
---------------------------- PROCESS ENDED (6596) for package com.google.mediapipe.examples.llminference ----------------------------
---------------------------- PROCESS ENDED (6666) for package com.google.mediapipe.examples.llminference ----------------------------
smithlai commented 2 months ago

Me too. Device: Pixel 7 Pro, Android 14. Models: gemma-2b-it-cpu-int4.bin / gemma-1.1-2b-it-cpu-int4.bin

However, gemma-2b-it-gpu-int4.bin works.

BalajiPolisetty2207 commented 1 month ago

I hope you pulled the latest llm_inference changes from the examples:

https://github.com/google-ai-edge/mediapipe-samples/blob/main/examples/llm_inference/android/app/src/main/java/com/google/mediapipe/examples/llminference/InferenceModel.kt
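For context, the linked InferenceModel.kt builds the MediaPipe LLM Inference task roughly as in the sketch below. This is a minimal illustration, not the sample's exact code: the model path and option values are placeholders, and option setters such as setTopK/setTemperature have moved between releases of the tasks-genai library, so check the version you depend on.

```kotlin
import android.content.Context
import com.google.mediapipe.tasks.genai.llminference.LlmInference

// Minimal sketch: create the LLM Inference task and run a single prompt.
// The model path and option values are placeholders; adjust them to match
// the .bin model you pushed to the device.
fun runGemma(context: Context, prompt: String): String {
    val options = LlmInference.LlmInferenceOptions.builder()
        .setModelPath("/data/local/tmp/llm/gemma-2b-it-cpu-int4.bin")
        .setMaxTokens(1024)      // prompt + response token budget
        .setTopK(40)
        .setTemperature(0.8f)
        .setRandomSeed(0)
        .build()

    val llmInference = LlmInference.createFromOptions(context, options)
    return llmInference.generateResponse(prompt)
}
```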

schmidt-sebastian commented 1 week ago

Can you measure the memory usage during the inference call? We are working on expanding our hardware support, but we are currently focused on bringing inference to more recent Android phones that provide more memory. If you are targeting the Pixel 4, I would suggest using a smaller LLM such as Falcon 1B, or the even smaller models outlined here: https://github.com/google-ai-edge/ai-edge-torch/tree/main/ai_edge_torch/generative/examples/smollm
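
For anyone who wants to capture the numbers asked for above, one option is to log native-heap and system memory immediately before and after the generateResponse() call. A hedged sketch using standard Android APIs follows; the helper name logMemory is made up for illustration.

```kotlin
import android.app.ActivityManager
import android.content.Context
import android.os.Debug
import android.util.Log

// Hypothetical helper: logs the app's native-heap allocation and the
// system-wide available memory. Call it right before and right after the
// inference call to bracket the memory spike.
fun logMemory(context: Context, tag: String) {
    val am = context.getSystemService(Context.ACTIVITY_SERVICE) as ActivityManager
    val info = ActivityManager.MemoryInfo()
    am.getMemoryInfo(info)

    val nativeHeapMb = Debug.getNativeHeapAllocatedSize() / (1024 * 1024)
    val availMb = info.availMem / (1024 * 1024)
    Log.d("MemCheck", "$tag: nativeHeap=${nativeHeapMb}MB availMem=${availMb}MB lowMemory=${info.lowMemory}")
}
```

Alternatively, running `adb shell dumpsys meminfo com.google.mediapipe.examples.llminference` while a prompt is being generated gives a similar picture without any code changes.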