Closed: xhzheng1895 closed this issue 2 months ago
Same problem here; hoping for a solution.
We are no longer actively maintaining the mlc-MiniCPM repository, while the upstream mlc-llm repository is still being updated continuously, so there may be some compatibility issues.
We recommend trying out our llama.cpp version:
Hi, I followed this page https://github.com/OpenBMB/mlc-MiniCPM?tab=readme-ov-file to build the Android app, but an undefined-symbol error was reported when compiling the model. I ran:
mlc_chat convert_weight --model-type ${MODEL_TYPE} ./dist/models/${MODEL_NAME}-hf/ --quantization $QUANTIZATION -o dist/$MODEL_NAME/
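For context, here is a sketch of how the shell variables in that command might be set. The concrete values below are illustrative assumptions, not taken from this thread; the actual model type, checkpoint directory name, and quantization scheme come from the mlc-MiniCPM README and your own setup:

```shell
# Assumed example values -- adjust to match your checkout and docs.
MODEL_TYPE=minicpm        # hypothetical --model-type value
MODEL_NAME=MiniCPM-2B     # hypothetical HF checkpoint directory prefix
QUANTIZATION=q4f16_1      # one quantization scheme mlc_chat supports

# Print the full command that would be run (mlc_chat is not invoked here).
echo mlc_chat convert_weight --model-type ${MODEL_TYPE} \
    ./dist/models/${MODEL_NAME}-hf/ \
    --quantization $QUANTIZATION \
    -o dist/$MODEL_NAME/
```

The command expects the HuggingFace-format weights under `./dist/models/${MODEL_NAME}-hf/` and writes the quantized weights to `dist/$MODEL_NAME/`.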
But when I check the library dependencies:
ldd libmlc_llm_module.so
and check whether the symbol is exported:
nm -D tvm/libtvm.so | grep _ZN3tvm7runtime7NDArray10CreateViewENS0_10ShapeTupleE10DLDataType
I get: 000000000013ca30 T _ZN3tvm7runtime7NDArray10CreateViewENS0_10ShapeTupleE10DLDataType (type T, so the symbol is defined in libtvm.so).
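For readers decoding the mangled name above: `c++filt` (from binutils) demangles Itanium-ABI symbol names, which makes it easier to compare what the loader reports as undefined against what `nm` shows:

```shell
# Demangle the symbol name reported by nm / the dynamic linker.
c++filt _ZN3tvm7runtime7NDArray10CreateViewENS0_10ShapeTupleE10DLDataType
# Demangles to: tvm::runtime::NDArray::CreateView(tvm::runtime::ShapeTuple, DLDataType)
```

A symbol that `nm -D` lists with type `T` is defined and exported by that library. If the loader still reports it as undefined at runtime, the process is most likely resolving against a different `libtvm.so` than the one inspected; `ldd` on the failing module or running with `LD_DEBUG=libs` shows which copy actually gets loaded.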
My environment:
OS: Ubuntu 20.04
Python: 3.11.9
Everything else follows https://llm.mlc.ai/docs/deploy/android.html
Hoping to get some help with this, thanks a lot!