mlc-ai / mlc-llm

Universal LLM Deployment Engine with ML Compilation
https://llm.mlc.ai/
Apache License 2.0

[Bug report] running on wsl2 also on windows? #2893

Open BlindDeveloper opened 2 months ago

BlindDeveloper commented 2 months ago

Hello, how do I run mlc llm on WSL2 using the CPU? I tried `mlc_llm chat HF://mlc-ai/Llama-3-8B-Instruct-q4f16_1-MLC` but got an error. Please provide me with a command whose error text I can copy with my mouse or arrow keys; I can't do that in my current setup because I'm using a screen reader.

puzzle9 commented 2 months ago

GPU works fine; CPU fails with the error below:

(ai) ➜  mlc-ai python -c "import tvm; print(tvm.__file__)"
/opt/conda/envs/ai/lib/python3.12/site-packages/tvm/__init__.py
(ai) ➜  mlc-ai mlc_llm chat HF://mlc-ai/Phi-3.5-mini-instruct-q4f16_1-MLC
[2024-09-11 19:56:10] INFO auto_device.py:88: Not found device: cuda:0
[2024-09-11 19:56:11] INFO auto_device.py:88: Not found device: rocm:0
[2024-09-11 19:56:12] INFO auto_device.py:88: Not found device: metal:0
[2024-09-11 19:56:14] INFO auto_device.py:88: Not found device: vulkan:0
[2024-09-11 19:56:15] INFO auto_device.py:88: Not found device: opencl:0
[2024-09-11 19:56:15] INFO auto_device.py:33: Not found: No available device detected
Traceback (most recent call last):
  File "/opt/conda/envs/ai/bin/mlc_llm", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/opt/conda/envs/ai/lib/python3.12/site-packages/mlc_llm/__main__.py", line 45, in main
    cli.main(sys.argv[2:])
  File "/opt/conda/envs/ai/lib/python3.12/site-packages/mlc_llm/cli/chat.py", line 36, in main
    chat(
  File "/opt/conda/envs/ai/lib/python3.12/site-packages/mlc_llm/interface/chat.py", line 285, in chat
    JSONFFIEngine(
  File "/opt/conda/envs/ai/lib/python3.12/site-packages/mlc_llm/json_ffi/engine.py", line 231, in __init__
    assert isinstance(device, tvm.runtime.Device)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AssertionError
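The log above shows why CPU-only WSL2 fails: the CLI probes devices in a fixed order (cuda, rocm, metal, vulkan, opencl) and aborts when none is found, since a plain CPU target is not in that list. As a minimal sketch of that fallback logic (this is an illustration of the probe order seen in the log, not the actual `auto_device.py` implementation, and `pick_device`/`available` are hypothetical names):

```python
# Sketch of the device auto-detection order shown in the log above.
# `available` stands in for whatever probing TVM actually performs.
PROBE_ORDER = ["cuda", "rocm", "metal", "vulkan", "opencl"]

def pick_device(available):
    """Return the first available device as 'name:0', or None."""
    for dev in PROBE_ORDER:
        if dev in available:
            return f"{dev}:0"
    return None  # corresponds to "No available device detected"
```

When `pick_device` returns `None`, the engine later receives something that is not a `tvm.runtime.Device`, which is what trips the `assert isinstance(device, tvm.runtime.Device)` above.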

BlindDeveloper commented 2 months ago

@puzzle9 could you provide command for running on gpu?

puzzle9 commented 2 months ago

> @puzzle9 could you provide command for running on gpu?

https://llm.mlc.ai/docs/install/mlc_llm.html#install-mlc-packages

I use Debian sid.

I installed nvidia-cuda-toolkit (12.1.1-2), then ran:

python -m pip install --pre -U -f https://mlc.ai/wheels mlc-llm-nightly-cu121 mlc-ai-nightly-cu121
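After installing the CUDA wheels, it may help to confirm that TVM actually sees the GPU before launching chat. A quick check, assuming the nightly CUDA packages above installed cleanly (the exact model URL is just the one used earlier in this thread):

```shell
# Confirm the CUDA-enabled TVM wheel is the one being imported
python -c "import tvm; print(tvm.__file__)"

# Confirm TVM can see the GPU (prints True if CUDA device 0 exists)
python -c "import tvm; print(tvm.cuda(0).exist)"

# Then launch chat, forcing the CUDA device explicitly
mlc_llm chat HF://mlc-ai/Phi-3.5-mini-instruct-q4f16_1-MLC --device cuda
```

This is a verification fragment, not something that can run without the GPU packages installed.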

BlindDeveloper commented 2 months ago

@puzzle9
Today I tried running it on Windows and got an error; it looks like the package isn't executable. If you have an Intel integrated video card, could you test on it?

puzzle9 commented 1 month ago

> @puzzle9 Today I tried running it on Windows and got an error; it looks like the package isn't executable. If you have an Intel integrated video card, could you test on it?

Sorry, I don't have an Intel graphics card.