mlc-ai / mlc-llm

Universal LLM Deployment Engine with ML Compilation
https://llm.mlc.ai/
Apache License 2.0
18.9k stars 1.55k forks

[Bug] Android app crashes after loading the model with `org.apache.tvm.Base$TVMError: InternalError: Check failed: type_code_ == kDLInt (8 vs. 0) : expected int but got Object` error #2797

Closed: psb-cc closed this issue 1 month ago

psb-cc commented 2 months ago

πŸ› Bug

The Android app crashes when I tap the 'chat' icon next to the model's name: the chat interface briefly flashes "Initialize..." for a second, and then the app closes.

To Reproduce

Steps to reproduce the behavior:

  1. Run `mlc_llm convert_weight ./dist/models/phi-2/ --quantization q4f16_1 -o model_weights/phi-2_for_apk` with the microsoft/phi-2 model weights from HF
  2. Run `mlc_llm gen_config ./dist/models/phi-2 --quantization q4f16_1 --conv-template redpajama_chat -o model_weights/phi-2_for_apk`
  3. Edit `mlc-llm/android/MLCChat/mlc-package-config.json` to point at the `phi-2_for_apk` directory containing the converted weights and config, and set `bundle_weight` to true
  4. Run `mlc_llm package`
  5. Build a signed APK from Android Studio
  6. Run `python bundle_weight.py --apk-path app/release/app-release.apk` after connecting the Android device, as suggested here
  7. Open the app.
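For reference, the edit in step 3 might look like the following sketch of `mlc-package-config.json`. The paths, `model_id`, and `estimated_vram_bytes` here are illustrative assumptions, not values from the report; check the MLC-LLM Android packaging docs for the exact schema.

```json
{
  "device": "android",
  "model_list": [
    {
      "model": "model_weights/phi-2_for_apk",
      "model_id": "phi-2-q4f16_1",
      "estimated_vram_bytes": 3000000000,
      "bundle_weight": true
    }
  ]
}
```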

STACK TRACE:

---------------------------- PROCESS STARTED (12368) for package ai.mlc.mlcchat ----------------------------
2024-08-12 13:18:02.793 12368-12452 ProfileInstaller        ai.mlc.mlcchat                       D  Installing profile for ai.mlc.mlcchat
2024-08-12 13:18:13.709 12368-12368 ViewRootIm...nActivity] ai.mlc.mlcchat                       I  ViewPostIme pointer 0
2024-08-12 13:18:13.777 12368-12368 ViewRootIm...nActivity] ai.mlc.mlcchat                       I  ViewPostIme pointer 1
2024-08-12 13:18:13.782 12368-12368 WindowOnBackDispatcher  ai.mlc.mlcchat                       W  OnBackInvokedCallback is not enabled for the application.
                                                                                                    Set 'android:enableOnBackInvokedCallback="true"' in the application manifest.
2024-08-12 13:18:13.841 12368-12368 Compatibil...geReporter ai.mlc.mlcchat                       D  Compat change id reported: 289878283; UID 10377; state: DISABLED
2024-08-12 13:18:13.860 12368-12368 Compatibil...geReporter ai.mlc.mlcchat                       D  Compat change id reported: 147798919; UID 10377; state: ENABLED
2024-08-12 13:18:13.863 12368-12368 Toast                   ai.mlc.mlcchat                       I  show: caller = ai.mlc.mlcchat.AppViewModel$ChatState$mainReloadChat$1$1.invokeSuspend:647 
2024-08-12 13:18:13.863 12368-12368 Toast                   ai.mlc.mlcchat                       I  show: contextDispId = 0 mCustomDisplayId = -1 focusedDisplayId = 0 isActivityContext = false
2024-08-12 13:18:15.375 12368-12409 TVM_RUNTIME             ai.mlc.mlcchat                       A  ~/projects/mlc-llm/3rdparty/tvm/include/tvm/runtime/packed_func.h:565: InternalError: Check failed: type_code_ == kDLInt (8 vs. 0) : expected int but got Object
2024-08-12 13:18:15.464 12368-12409 AndroidRuntime          ai.mlc.mlcchat                       E  FATAL EXCEPTION: Thread-5
                                                                                                    Process: ai.mlc.mlcchat, PID: 12368
                                                                                                    org.apache.tvm.Base$TVMError: InternalError: Check failed: type_code_ == kDLInt (8 vs. 0) : expected int but got Object
                                                                                                    Stack trace:
                                                                                                      File "~/projects/mlc-llm/3rdparty/tvm/include/tvm/runtime/packed_func.h", line 565

                                                                                                        at org.apache.tvm.Base.checkCall(Base.java:173)
                                                                                                        at org.apache.tvm.Function.invoke(Function.java:130)
                                                                                                        at ai.mlc.mlcllm.JSONFFIEngine.runBackgroundLoop(JSONFFIEngine.java:64)
                                                                                                        at ai.mlc.mlcllm.MLCEngine$backgroundWorker$1.invoke(MLCEngine.kt:42)
                                                                                                        at ai.mlc.mlcllm.MLCEngine$backgroundWorker$1.invoke(MLCEngine.kt:40)
                                                                                                        at ai.mlc.mlcllm.BackgroundWorker$start$1.invoke(MLCEngine.kt:19)
                                                                                                        at ai.mlc.mlcllm.BackgroundWorker$start$1.invoke(MLCEngine.kt:18)
                                                                                                        at kotlin.concurrent.ThreadsKt$thread$thread$1.run(Thread.kt:30)
2024-08-12 13:18:15.481 12368-12409 Process                 ai.mlc.mlcchat                       I  Sending signal. PID: 12368 SIG: 9
---------------------------- PROCESS ENDED (12368) for package ai.mlc.mlcchat ----------------------------

Expected behavior

The app should open the chatting interface as it does with the Demo App.

Environment

Additional context

I've also tested the pipeline with a smaller LLM, Qwen/Qwen2-0.5B, but hit the same error.

MasterJH5574 commented 2 months ago

Hi @psb-cc, could you run `git status` under the mlc-llm directory to check whether there are any other changes in 3rdparty/tvm? If so, you can first run `git submodule update --recursive` to update 3rdparty/tvm and then build the Android app again.

psb-cc commented 1 month ago

Hi @MasterJH5574, thank you for the response. I checked, and there was no pending update in 3rdparty/tvm. However, I took a fresh pull of mlc-llm and ran the same commands again, and it seems to have worked based on my initial testing with phi-2. I will test with better hardware and post the results here. Awesome tool; thanks again!

MasterJH5574 commented 1 month ago

@psb-cc Thanks for the follow-up. Glad that a fresh pull works :-)

Acoee commented 1 month ago

A new pull doesn't work for me 😫