python3 -m mlc_llm.build --target android --max-seq-len 768 --model ./dist/models/BlueLM-7B-Chat --quantization q4f16_1
Traceback (most recent call last):
  File "/home/mbu-lap/miniconda3/envs/tvm-build-venv/lib/python3.8/runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/home/mbu-lap/miniconda3/envs/tvm-build-venv/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/home/mbu-lap/Documents/Vinay/Mob/mlc-llm/mlc_llm/build.py", line 47, in <module>
    main()
  File "/home/mbu-lap/Documents/Vinay/Mob/mlc-llm/mlc_llm/build.py", line 41, in main
    parsed_args = core._parse_args(parsed_args)  # pylint: disable=protected-access
  File "/home/mbu-lap/Documents/Vinay/Mob/mlc-llm/mlc_llm/core.py", line 325, in _parse_args
    parsed = _setup_model_path(parsed)
  File "/home/mbu-lap/Documents/Vinay/Mob/mlc-llm/mlc_llm/core.py", line 368, in _setup_model_path
    validate_config(args.model_path)
  File "/home/mbu-lap/Documents/Vinay/Mob/mlc-llm/mlc_llm/core.py", line 411, in validate_config
    assert (
AssertionError: Model type BlueLM not supported.
To Reproduce

Platform (e.g. Android):
Operating system (e.g. Ubuntu):
Device (e.g. Android):
How you installed MLC-LLM (conda, source): conda
How you installed TVM-Unity (pip, source): pip
Python version (e.g. 3.8, 3.10):
GPU driver version (if applicable):
CUDA/cuDNN version (if applicable):
TVM Unity Hash Tag (python -c "import tvm; print('\n'.join(f'{k}: {v}' for k, v in tvm.support.libinfo().items()))", applicable if you compile models):
USE_NVTX: OFF
USE_GTEST: AUTO
SUMMARIZE: OFF
USE_IOS_RPC: OFF
USE_MSC: OFF
USE_ETHOSU:
CUDA_VERSION: NOT-FOUND
USE_LIBBACKTRACE: AUTO
DLPACK_PATH: 3rdparty/dlpack/include
USE_TENSORRT_CODEGEN: OFF
USE_THRUST: OFF
USE_TARGET_ONNX: OFF
USE_AOT_EXECUTOR: ON
BUILD_DUMMY_LIBTVM: OFF
USE_CUDNN: OFF
USE_TENSORRT_RUNTIME: OFF
USE_ARM_COMPUTE_LIB_GRAPH_EXECUTOR: OFF
USE_CCACHE: AUTO
USE_ARM_COMPUTE_LIB: OFF
USE_CPP_RTVM:
USE_OPENCL_GTEST: /path/to/opencl/gtest
USE_MKL: OFF
USE_PT_TVMDSOOP: OFF
MLIR_VERSION: NOT-FOUND
USE_CLML: OFF
USE_STACKVM_RUNTIME: OFF
USE_GRAPH_EXECUTOR_CUDA_GRAPH: OFF
ROCM_PATH: /opt/rocm
USE_DNNL: OFF
USE_VITIS_AI: OFF
USE_MLIR: OFF
USE_RCCL: OFF
USE_LLVM: llvm-config --ignore-libllvm --link-static
USE_VERILATOR: OFF
USE_TF_TVMDSOOP: OFF
USE_THREADS: ON
USE_MSVC_MT: OFF
BACKTRACE_ON_SEGFAULT: OFF
USE_GRAPH_EXECUTOR: ON
USE_NCCL: OFF
USE_ROCBLAS: OFF
GIT_COMMIT_HASH: 3001b20b0dd114cad23fccb25cbb055ce80a224e
USE_VULKAN: ON
USE_RUST_EXT: OFF
USE_CUTLASS: OFF
USE_CPP_RPC: OFF
USE_HEXAGON: OFF
USE_CUSTOM_LOGGING: OFF
USE_UMA: OFF
USE_FALLBACK_STL_MAP: OFF
USE_SORT: ON
USE_RTTI: ON
GIT_COMMIT_TIME: 2023-11-01 10:48:36 -0400
USE_HEXAGON_SDK: /path/to/sdk
USE_BLAS: none
USE_ETHOSN: OFF
USE_LIBTORCH: OFF
USE_RANDOM: ON
USE_CUDA: OFF
USE_COREML: OFF
USE_AMX: OFF
BUILD_STATIC_RUNTIME: OFF
USE_CMSISNN: OFF
USE_KHRONOS_SPIRV: OFF
USE_CLML_GRAPH_EXECUTOR: OFF
USE_TFLITE: OFF
USE_HEXAGON_GTEST: /path/to/hexagon/gtest
PICOJSON_PATH: 3rdparty/picojson
USE_OPENCL_ENABLE_HOST_PTR: OFF
INSTALL_DEV: OFF
USE_PROFILER: ON
USE_NNPACK: OFF
LLVM_VERSION: 15.0.7
USE_OPENCL: OFF
COMPILER_RT_PATH: 3rdparty/compiler-rt
RANG_PATH: 3rdparty/rang/include
USE_SPIRV_KHR_INTEGER_DOT_PRODUCT: OFF
USE_OPENMP: OFF
USE_BNNS: OFF
USE_CUBLAS: OFF
USE_METAL: OFF
USE_MICRO_STANDALONE_RUNTIME: OFF
USE_HEXAGON_EXTERNAL_LIBS: OFF
USE_ALTERNATIVE_LINKER: AUTO
USE_BYODT_POSIT: OFF
USE_HEXAGON_RPC: OFF
USE_MICRO: OFF
DMLC_PATH: 3rdparty/dmlc-core/include
INDEX_DEFAULT_I64: ON
USE_RELAY_DEBUG: OFF
USE_RPC: ON
USE_TENSORFLOW_PATH: none
TVM_CLML_VERSION:
USE_MIOPEN: OFF
USE_ROCM: OFF
USE_PAPI: OFF
USE_CURAND: OFF
TVM_CXX_COMPILER_PATH: /opt/rh/gcc-toolset-11/root/usr/bin/c++
HIDE_PRIVATE_SYMBOLS: ON
Any other relevant information:
Additional context
@Vinaysukhesh98 Thanks for the report. It's not a bug; the BlueLM model is simply not supported yet. Please follow the tutorial to add support yourself, or open a model-request issue and wait for someone in the community to volunteer.
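For context, the assertion in the traceback comes from a simple allow-list check on the `model_type` field of the checkpoint's `config.json`. Below is a minimal sketch of that kind of validation; the supported-type list, path layout, and exact wording here are illustrative assumptions, not MLC-LLM's actual implementation:

```python
import json
from pathlib import Path

# Illustrative subset of supported architectures; the real list lives in MLC-LLM.
SUPPORTED_MODEL_TYPES = {"llama", "gpt_neox", "mistral"}

def validate_config(model_path: str) -> None:
    """Reject checkpoints whose config.json declares an unsupported model type."""
    config = json.loads((Path(model_path) / "config.json").read_text())
    model_type = config.get("model_type", "")
    assert model_type.lower() in SUPPORTED_MODEL_TYPES, (
        f"Model type {model_type} not supported."
    )
```

A BlueLM checkpoint declares a `model_type` outside the allow-list, so a check like this fails before any compilation starts; adding support means implementing the architecture and registering its type, not just silencing the assert.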
🐛 Bug: Trying to compile with different models