OpenBMB / MiniCPM-V

MiniCPM-Llama3-V 2.5: A GPT-4V Level Multimodal LLM on Your Phone

Error when running mlc_chat convert_weight: /mlc-mini/build/libmlc_llm_module.dylib, 0x0006): Symbol not found: __ZN3tvm7runtime7NDArray10CreateViewENS0_10ShapeTupleE10DLDataType #276

Open zhb-code opened 2 weeks ago

zhb-code commented 2 weeks ago

Is there an existing issue / discussion for this?

Is there an existing answer for this in the FAQ?

Current Behavior

I am preparing to deploy MiniCPM on an Android client and have run into the problem below. I hope the maintainers can help me resolve it, thanks!

Expected Behavior

mlc_chat convert_weight currently keeps failing with an error; I expect it to run through successfully.

Steps To Reproduce

1. Set up the environment variables.
2. Build mlc_chat_cli from source following the official docs:

   # generate build configuration
   mkdir -p build && cd build
   python3 ../cmake/gen_cmake_config.py && cd ..

   # build mlc_chat_cli
   cd build && cmake .. && cmake --build . --parallel $(nproc) && cd ..

   # install the Python package
   cd python && pip install -e . && cd ..

   Everything works up to this point.
3. Download the model from https://modelscope.cn/models/OpenBMB/miniCPM-bf16/files
4. Run the weight conversion; this step fails with the error below (see the sketch after this list):

   mlc_chat convert_weight --model-type ${MODEL_TYPE} ./dist/models/${MODEL_NAME}-hf/ --quantization $QUANTIZATION -o dist/$MODEL_NAME/
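A sketch of the same convert_weight call with the placeholders filled in. The concrete values here (MiniCPM as the model directory name, minicpm as the model type, q4f16_1 as the quantization) are only illustrative assumptions to show the command shape, not values taken from the official docs; substitute whatever your setup actually uses:

   # Illustrative values only -- not taken from the official docs
   MODEL_NAME=MiniCPM
   MODEL_TYPE=minicpm
   QUANTIZATION=q4f16_1
   mlc_chat convert_weight --model-type ${MODEL_TYPE} ./dist/models/${MODEL_NAME}-hf/ --quantization $QUANTIZATION -o dist/$MODEL_NAME/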

Error message:

OSError: dlopen(/Users/zhb/Documents/StudioWorkSpace/mlc-mini/build/libmlc_llm_module.dylib, 0x0006): Symbol not found: __ZN3tvm7runtime7NDArray10CreateViewENS0_10ShapeTupleE10DLDataType
  Referenced from: /Users/zhb/Documents/StudioWorkSpace/mlc-mini/build/libmlc_llm_module.dylib
  Expected in: <533BA3B9-AEB8-3E4C-94D9-6E70457BEDF3> /opt/miniconda3/envs/mlc-mini/lib/python3.11/site-packages/tvm/libtvm.dylib

How should this be resolved?
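A "Symbol not found" error of this kind usually means libmlc_llm_module.dylib was compiled against a different TVM than the libtvm.dylib the Python environment loads. A minimal diagnostic sketch, assuming the paths from the error message above and the standard macOS otool / nm tools:

   # Show which libtvm.dylib the compiled module was linked against
   otool -L /Users/zhb/Documents/StudioWorkSpace/mlc-mini/build/libmlc_llm_module.dylib

   # Show which TVM package the Python environment actually loads
   python3 -c "import tvm; print(tvm.__file__)"

   # Check whether that libtvm exports the missing NDArray::CreateView symbol
   nm -gU /opt/miniconda3/envs/mlc-mini/lib/python3.11/site-packages/tvm/libtvm.dylib | grep CreateView

If the grep comes back empty, the TVM used for the cmake build and the TVM installed in the conda environment are presumably out of sync, and rebuilding both against the same TVM source (or installing a matching TVM package) would be the usual direction to try; this is an inference from the symbol mismatch, not a confirmed fix.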

Environment

- OS: macOS (Apple M3 Pro)
- Python: 3.11
- Transformers:
- PyTorch:
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`):

Anything else?

No response