mlc-ai / mlc-llm

Universal LLM Deployment Engine with ML Compilation
https://llm.mlc.ai/
Apache License 2.0

[Bug] TVM runtime cannot find vm_load_executable #1535

Closed: xuqiang-idotools closed this issue 8 months ago

xuqiang-idotools commented 10 months ago

🐛 Bug

I built the library libtvm4j_runtime_packed.so with the prebuilt tar below: https://github.com/mlc-ai/binary-mlc-llm-libs/blob/main/Mistral-7B-Instruct-v0.2/Mistral-7B-Instruct-v0.2-q4f16_1-android.tar

Running the Android app then fails with the following error:

    cpp/llm_chat.cc:161: InternalError: Check failed: (fload_exec.defined()) is false: TVM runtime cannot find vm_load_executable

To Reproduce

Steps to reproduce the behavior:

  1. Download the prebuilt tar: Mistral-7B-Instruct-v0.2-q4f16_1-android.tar
  2. Set the environment as below (the full command sequence is sketched after this list):
         export ANDROID_NDK="~/Library/Android/sdk/ndk/25.2.9519653"
         export TVM_NDK_CC="~/Library/Android/sdk/ndk/25.2.9519653/toolchains/llvm/prebuilt/darwin-x86_64/bin/x86_64-linux-android24-clang"
         export TVM_HOME="~/tvm/mlc-llm/3rdparty/tvm"
  3. Run the script prepare_libs.sh.
  4. Copy the resulting .so and .jar into the Android project and run the app.
  5. Observe the error: TVM runtime cannot find vm_load_executable
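For reference, a consolidated sketch of the steps above as a shell session, assuming prepare_libs.sh lives under android/library/ (the directory implied by the reply below) and that the repo is checked out at ~/tvm/mlc-llm as in the environment variables; the copy destinations are placeholders, not the project's actual layout:

```bash
# Reproduce sketch on macOS; values mirror the report above, paths are illustrative.
export ANDROID_NDK="$HOME/Library/Android/sdk/ndk/25.2.9519653"
export TVM_NDK_CC="$ANDROID_NDK/toolchains/llvm/prebuilt/darwin-x86_64/bin/x86_64-linux-android24-clang"
export TVM_HOME="$HOME/tvm/mlc-llm/3rdparty/tvm"

# Package the TVM runtime and model libs for Android.
cd ~/tvm/mlc-llm/android/library
./prepare_libs.sh

# Copy the produced libtvm4j_runtime_packed.so and tvm4j_core.jar into the
# Android app project (destinations shown are placeholders):
# cp <build-output>/libtvm4j_runtime_packed.so <android-project>/...
# cp <build-output>/tvm4j_core.jar <android-project>/...
```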

Expected behavior

The Android app runs successfully.

Environment

Additional context

I always get this error, whether I use the prebuilt tar or compile my own tar.

Hzfengsy commented 10 months ago

You need to edit android/library/src/main/assets/app-config.json before running prepare_libs.sh, following the docs: https://llm.mlc.ai/docs/deploy/android.html
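In other words, the config edit has to happen before the packaging step. A minimal sketch of that ordering, assuming the same android/library layout:

```bash
cd android/library

# 1. List the model lib in src/main/assets/app-config.json, e.g.
#      "model_libs": ["Mistral-7B-Instruct-v0.2-q4f16_1"]
# 2. Only then run the packaging script, so the model lib is picked up
#    when libtvm4j_runtime_packed.so is built:
./prepare_libs.sh
```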

xuqiang-idotools commented 10 months ago

Yes, I had already modified it as below:

    {
      "model_libs": [
        "Mistral-7B-Instruct-v0.2-q4f16_1"
      ],
      "model_list": [
        {
          "model_url": "https://huggingface.co/mlc-ai/mlc-chat-Mistral-7B-Instruct-v0.2-q4f16_1/",
          "local_id": "Mistral-7B-Instruct-v0.2-q4f16_1"
        }
      ],
      "add_model_samples": []
    }

Nick-infinity commented 10 months ago

Answered here: https://github.com/mlc-ai/mlc-llm/issues/1517

CharlieFRuan commented 8 months ago

Closing this one for now. Feel free to open another one if issues persist!