mlc-ai / mlc-llm

Universal LLM Deployment Engine with ML Compilation
https://llm.mlc.ai/
Apache License 2.0

[Bug] Installation of mlc with Llama-2-7b-chat on Intel MacBook Pro 16 fails with error "incompatible architecture" #879

Closed · sailingcai closed this issue 1 year ago

sailingcai commented 1 year ago

πŸ› Bug

After the installation is complete, run: llm -m Llama-2-7b-chat 'five names for a cute pet ferret'

Error messages show:

Error: Traceback (most recent call last):
  File "/Users/runner/work/package/package/tvm/src/runtime/dso_library.cc", line 125
InternalError: Check failed: (lib_handle_ != nullptr) is false: Failed to load dynamic shared library /Users/sailing/Library/Application Support/io.datasette.llm/mlc/dist/prebuilt/lib/Llama-2-7b-chat-hf-q4f16_1-metal.so
dlopen(/Users/sailing/Library/Application Support/io.datasette.llm/mlc/dist/prebuilt/lib/Llama-2-7b-chat-hf-q4f16_1-metal.so, 0x0005): tried:
'/Users/sailing/Library/Application Support/io.datasette.llm/mlc/dist/prebuilt/lib/Llama-2-7b-chat-hf-q4f16_1-metal.so' (mach-o file, but is an incompatible architecture (have 'arm64', need 'x86_64h' or 'x86_64')),
'/System/Volumes/Preboot/Cryptexes/OS/Users/sailing/Library/Application Support/io.datasette.llm/mlc/dist/prebuilt/lib/Llama-2-7b-chat-hf-q4f16_1-metal.so' (no such file),
'/Users/sailing/Library/Application Support/io.datasette.llm/mlc/dist/prebuilt/lib/Llama-2-7b-chat-hf-q4f16_1-metal.so' (mach-o file, but is an incompatible architecture (have 'arm64', need 'x86_64h' or 'x86_64'))
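The decisive part is the dlopen check: the prebuilt Metal library is an arm64 build, while the Intel machine needs x86_64. For anyone reproducing this, the mismatch can be confirmed with standard macOS tools (the path below is just the one from the traceback; this is a diagnostic sketch, not an llm-mlc command):

    # Host architecture; an Intel MacBook Pro should report x86_64
    uname -m

    # Architecture of the prebuilt model library that dlopen rejected
    file "/Users/sailing/Library/Application Support/io.datasette.llm/mlc/dist/prebuilt/lib/Llama-2-7b-chat-hf-q4f16_1-metal.so"
    lipo -info "/Users/sailing/Library/Application Support/io.datasette.llm/mlc/dist/prebuilt/lib/Llama-2-7b-chat-hf-q4f16_1-metal.so"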

To Reproduce

Steps to reproduce the behavior:

  1. pip install llm
  2. llm install llm-mlc
  3. llm mlc pip install --pre --force-reinstall mlc-ai-nightly mlc-chat-nightly -f https://mlc.ai/wheels
  4. brew install git-lfs
  5. llm mlc setup
  6. llm mlc download-model Llama-2-7b-chat --alias llama2
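
After these steps, a quick way to confirm that the plugin and model registered is to list them with the llm CLI itself (this assumes the llm-mlc plugin exposes its models through the standard plugin hooks):

    # Show installed llm plugins; llm-mlc should appear here
    llm plugins

    # List available models; the llama2 alias should be present if the download succeeded
    llm models list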

The llm mlc pip install of mlc-ai-nightly and mlc-chat-nightly logged the following:

Looking in links: https://mlc.ai/wheels
Collecting mlc-ai-nightly
  Downloading https://github.com/mlc-ai/package/releases/download/v0.9.dev0/mlc_ai_nightly-0.12.dev1454-cp310-cp310-macosx_10_15_x86_64.whl (74.9 MB)
Collecting mlc-chat-nightly
  Downloading https://github.com/mlc-ai/package/releases/download/v0.9.dev0/mlc_chat_nightly-0.1.dev403-cp310-cp310-macosx_10_15_x86_64.whl (7.0 MB)
Collecting tornado
  Using cached tornado-6.3.3-cp38-abi3-macosx_10_9_x86_64.whl (423 kB)
Collecting typing-extensions
  Using cached typing_extensions-4.7.1-py3-none-any.whl (33 kB)
Collecting decorator
  Using cached decorator-5.1.1-py3-none-any.whl (9.1 kB)
Collecting numpy
  Downloading numpy-1.26.0b1-cp310-cp310-macosx_10_9_x86_64.whl (21.5 MB)
Collecting psutil
  Using cached psutil-5.9.5-cp36-abi3-macosx_10_9_x86_64.whl (245 kB)
Collecting ml-dtypes
  Downloading ml_dtypes-0.2.0-cp310-cp310-macosx_10_9_universal2.whl (1.2 MB)
Collecting scipy
  Downloading scipy-1.11.2-cp310-cp310-macosx_10_9_x86_64.whl (37.2 MB)
Collecting attrs
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting cloudpickle
  Using cached cloudpickle-2.2.1-py3-none-any.whl (25 kB)
Collecting uvicorn
  Using cached uvicorn-0.23.2-py3-none-any.whl (59 kB)
Collecting shortuuid
  Downloading shortuuid-1.0.11-py3-none-any.whl (10 kB)
Collecting fastapi
  Using cached fastapi-0.103.1-py3-none-any.whl (66 kB)
Collecting starlette<0.28.0,>=0.27.0
  Using cached starlette-0.27.0-py3-none-any.whl (66 kB)
Collecting pydantic!=1.8,!=1.8.1,!=2.0.0,!=2.0.1,!=2.1.0,<3.0.0,>=1.7.4
  Using cached pydantic-2.3.0-py3-none-any.whl (374 kB)
Collecting anyio<4.0.0,>=3.7.1
  Using cached anyio-3.7.1-py3-none-any.whl (80 kB)
Collecting click>=7.0
  Using cached click-8.1.7-py3-none-any.whl (97 kB)
Collecting h11>=0.8
  Using cached h11-0.14.0-py3-none-any.whl (58 kB)
Collecting exceptiongroup
  Using cached exceptiongroup-1.1.3-py3-none-any.whl (14 kB)
Collecting idna>=2.8
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting sniffio>=1.1
  Using cached sniffio-1.3.0-py3-none-any.whl (10 kB)
Collecting pydantic-core==2.6.3
  Using cached pydantic_core-2.6.3-cp310-cp310-macosx_10_7_x86_64.whl (1.7 MB)
Collecting annotated-types>=0.4.0
  Using cached annotated_types-0.5.0-py3-none-any.whl (11 kB)
Installing collected packages: typing-extensions, tornado, sniffio, shortuuid, psutil, numpy, idna, h11, exceptiongroup, decorator, cloudpickle, click, attrs, annotated-types, uvicorn, scipy, pydantic-core, ml-dtypes, anyio, starlette, pydantic, mlc-ai-nightly, fastapi, mlc-chat-nightly
Attempting uninstall: typing-extensions
  Found existing installation: typing_extensions 4.7.1
  Uninstalling typing_extensions-4.7.1: Successfully uninstalled typing_extensions-4.7.1
Attempting uninstall: sniffio
  Found existing installation: sniffio 1.3.0
  Uninstalling sniffio-1.3.0: Successfully uninstalled sniffio-1.3.0
Attempting uninstall: idna
  Found existing installation: idna 3.4
  Uninstalling idna-3.4: Successfully uninstalled idna-3.4
Attempting uninstall: h11
  Found existing installation: h11 0.14.0
  Uninstalling h11-0.14.0: Successfully uninstalled h11-0.14.0
Attempting uninstall: exceptiongroup
  Found existing installation: exceptiongroup 1.1.3
  Uninstalling exceptiongroup-1.1.3: Successfully uninstalled exceptiongroup-1.1.3
Attempting uninstall: click
  Found existing installation: click 8.1.7
  Uninstalling click-8.1.7: Successfully uninstalled click-8.1.7
Attempting uninstall: attrs
  Found existing installation: attrs 23.1.0
  Uninstalling attrs-23.1.0: Successfully uninstalled attrs-23.1.0
Attempting uninstall: annotated-types
  Found existing installation: annotated-types 0.5.0
  Uninstalling annotated-types-0.5.0: Successfully uninstalled annotated-types-0.5.0
Attempting uninstall: pydantic-core
  Found existing installation: pydantic_core 2.6.3
  Uninstalling pydantic_core-2.6.3: Successfully uninstalled pydantic_core-2.6.3
Attempting uninstall: anyio
  Found existing installation: anyio 4.0.0
  Uninstalling anyio-4.0.0: Successfully uninstalled anyio-4.0.0
Attempting uninstall: pydantic
  Found existing installation: pydantic 2.3.0
  Uninstalling pydantic-2.3.0: Successfully uninstalled pydantic-2.3.0
Successfully installed annotated-types-0.5.0 anyio-3.7.1 attrs-23.1.0 click-8.1.7 cloudpickle-2.2.1 decorator-5.1.1 exceptiongroup-1.1.3 fastapi-0.103.1 h11-0.14.0 idna-3.4 ml-dtypes-0.2.0 mlc-ai-nightly-0.12.dev1454 mlc-chat-nightly-0.1.dev403 numpy-1.26.0b1 psutil-5.9.5 pydantic-2.3.0 pydantic-core-2.6.3 scipy-1.11.2 shortuuid-1.0.11 sniffio-1.3.0 starlette-0.27.0 tornado-6.3.3 typing-extensions-4.7.1 uvicorn-0.23.2
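
Worth noting from the log above: every wheel pip resolved is an x86_64 (or universal2) macOS build, so the Python packages match the Intel host; the arm64 artifact is only the prebuilt model library. As a further sanity check (plain Python and pip commands, nothing specific to llm-mlc), one can verify the interpreter and wheels are not running under a mismatched architecture:

    # Should print x86_64 on an Intel Mac
    python3 -c "import platform; print(platform.machine())"

    # Confirm which mlc nightly builds ended up installed
    python3 -m pip show mlc-ai-nightly mlc-chat-nightly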

Expected behavior

The Llama-2-7b-chat model should run properly with Metal acceleration.

Environment

Hzfengsy commented 1 year ago

x86_64 is supported with a different suffix, metal_x86_64; see the prebuilt libs here
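
A rough sketch of fetching that x86_64 build by hand, in case it helps; the repository name and layout follow the MLC prebuilt-libs instructions of the time, and the directory comes from the traceback above, so treat all of it as an assumption rather than an official llm-mlc workflow:

    # Assumed location of the prebuilt model libraries (mlc-ai/binary-mlc-llm-libs)
    cd "/Users/sailing/Library/Application Support/io.datasette.llm/mlc/dist/prebuilt"
    git clone https://github.com/mlc-ai/binary-mlc-llm-libs.git prebuilt_libs

    # The Intel Metal build should carry the metal_x86_64 suffix
    ls prebuilt_libs/Llama-2-7b-chat-hf-q4f16_1-metal_x86_64.so

    # Hypothetical manual swap: if the runtime keeps looking for the -metal.so name
    # from the traceback, drop the x86_64 build in under that name
    cp prebuilt_libs/Llama-2-7b-chat-hf-q4f16_1-metal_x86_64.so lib/Llama-2-7b-chat-hf-q4f16_1-metal.so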

lucaswiman commented 1 year ago

@Hzfengsy is there a command-line flag or environment variable I can use to install it? I tried installing with this command, which still fails:

 llm mlc download-model https://huggingface.co/mlc-ai/mlc-chat-Llama-2-7b-chat-hf-q4f16_1-metal_x86_64 --alias llama2
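
If that still fails the same way, it may help to check what actually landed on disk after the download, since the loader error only ever mentions the -metal.so name; the path below is the one from the traceback earlier in this thread:

    # See which model libraries are present and what architecture each one is
    ls "/Users/sailing/Library/Application Support/io.datasette.llm/mlc/dist/prebuilt/lib/"
    file "/Users/sailing/Library/Application Support/io.datasette.llm/mlc/dist/prebuilt/lib/"*.so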