OpenBMB / MiniCPM

MiniCPM3-4B: An edge-side LLM that surpasses GPT-3.5-Turbo.
Apache License 2.0

Android build failed #144

Closed · xhzheng1895 closed this issue 2 months ago

xhzheng1895 commented 3 months ago

Hi, I followed this page https://github.com/OpenBMB/mlc-MiniCPM?tab=readme-ov-file to build the Android app, but an undefined symbol error is reported when compiling the model. I ran:

    mlc_chat convert_weight --model-type ${MODEL_TYPE} ./dist/models/${MODEL_NAME}-hf/ --quantization $QUANTIZATION -o dist/$MODEL_NAME/

and got:

  File "/data_sdb/demos/mlc-MiniCPM/python/mlc_chat/__init__.py", line 5, in <module>
    from .chat_module import ChatConfig, ChatModule, ConvConfig, GenerationConfig
  File "/data_sdb/demos/mlc-MiniCPM/python/mlc_chat/chat_module.py", line 20, in <module>
    from . import base as _
  File "/data_sdb/demos/mlc-MiniCPM/python/mlc_chat/base.py", line 28, in <module>
    _LIB, _LIB_PATH = _load_mlc_llm_lib()
                      ^^^^^^^^^^^^^^^^^^^
  File "/data_sdb/demos/mlc-MiniCPM/python/mlc_chat/base.py", line 23, in _load_mlc_llm_lib
    return ctypes.CDLL(lib_path[0]), lib_path[0]
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/ctypes/__init__.py", line 376, in __init__
    self._handle = _dlopen(self._name, mode)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^
OSError: /data_sdb/demos/mlc-MiniCPM/build/libmlc_llm_module.so: undefined symbol: _ZN3tvm7runtime7NDArray10CreateViewENS0_10ShapeTupleE10DLDataType
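
For reference, the failure can be reproduced outside mlc_chat with plain ctypes (a minimal sketch, reusing the paths from the traceback above), which suggests the problem is at the dynamic-linking level rather than in the Python package itself:

    # Minimal sketch (path taken from the traceback above): call ctypes.CDLL
    # directly on the shared library, outside of mlc_chat, to confirm the
    # failure happens at dlopen time rather than in the Python code.
    import ctypes

    LIB = "/data_sdb/demos/mlc-MiniCPM/build/libmlc_llm_module.so"

    try:
        ctypes.CDLL(LIB)
        print("loaded OK")
    except OSError as err:
        # Prints the same "undefined symbol" message as the traceback above.
        print("dlopen failed:", err)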

Checking dependencies with `ldd libmlc_llm_module.so` gives:

        linux-vdso.so.1 (0x00007ffe2f75c000)
        libtvm.so => /data_sdb/demos/mlc-MiniCPM/build/tvm/libtvm.so (0x00007fe3362b9000)
        libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007fe336294000)
        libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007fe336271000)
        libstdc++.so.6 => /lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007fe33608f000)
        libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007fe335f40000)
        libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007fe335f23000)
        libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007fe335d31000)
        /lib64/ld-linux-x86-64.so.2 (0x00007fe336d72000)

Checking the symbol with `nm -D tvm/libtvm.so | grep _ZN3tvm7runtime7NDArray10CreateViewENS0_10ShapeTupleE10DLDataType` gives:

    000000000013ca30 T _ZN3tvm7runtime7NDArray10CreateViewENS0_10ShapeTupleE10DLDataType
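
Since the symbol is present in the build tree's libtvm.so, one possible explanation (only a guess) is that a different libtvm.so, for example from a pip-installed TVM, gets resolved at runtime instead. The sketch below is a diagnostic idea under that assumption: preload the build tree's libtvm.so with RTLD_GLOBAL, retry loading the module, and list every libtvm.so actually mapped into the process:

    # Diagnostic sketch (assumption: a different libtvm.so, e.g. from a
    # pip-installed TVM, might be resolved at runtime instead of the one in
    # the build tree).
    import ctypes

    TVM_LIB = "/data_sdb/demos/mlc-MiniCPM/build/tvm/libtvm.so"
    MLC_LIB = "/data_sdb/demos/mlc-MiniCPM/build/libmlc_llm_module.so"

    # Make the build tree's TVM symbols globally visible, then try the module again.
    ctypes.CDLL(TVM_LIB, mode=ctypes.RTLD_GLOBAL)
    try:
        ctypes.CDLL(MLC_LIB)
        print("module loaded")
    except OSError as err:
        print("still failing:", err)

    # List every libtvm.so currently mapped into this process (Linux only);
    # more than one entry, or an unexpected path, would point to a mismatch.
    with open("/proc/self/maps") as maps:
        libs = {line.split()[-1] for line in maps if "libtvm" in line}
    for path in sorted(libs):
        print(path)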

My environment: OS: Ubuntu 20.04, Python: 3.11.9. Everything else follows https://llm.mlc.ai/docs/deploy/android.html

I'd appreciate some help with this, thanks a lot!

Single430 commented 3 months ago

Same issue here, looking forward to a solution.

Achazwl commented 3 months ago

We are no longer actively maintaining the mlc-MiniCPM repository, while the upstream mlc-llm repository is still being updated continuously, so there may be some compatibility issues.

We recommend trying our llama.cpp versions instead:

  1. MiniCPM-Llama3-V 2.5: llama.cpp/examples/minicpmv/README.md at minicpm-v2.5 · OpenBMB/llama.cpp (github.com)
  2. MiniCPM-V 2: llama.cpp/examples/minicpmv at feat-minicpmv · Achazwl/llama.cpp (github.com)