mlc-ai / mlc-llm

Universal LLM Deployment Engine with ML Compilation
https://llm.mlc.ai/
Apache License 2.0

[Question] Unable to install mlc-llm - error: module 'tvm.script.parser.tir' has no attribute 'bitwise_and' #832

Closed qizzzh closed 1 year ago

qizzzh commented 1 year ago

Steps I followed:

```
Traceback (most recent call last):
  File "/usr/local/Cellar/python@3.10/3.10.9/Frameworks/Python.framework/Versions/3.10/lib/python3.10/runpy.py", line 187, in _run_module_as_main
    mod_name, mod_spec, code = _get_module_details(mod_name, _Error)
  File "/usr/local/Cellar/python@3.10/3.10.9/Frameworks/Python.framework/Versions/3.10/lib/python3.10/runpy.py", line 110, in _get_module_details
    __import__(pkg_name)
  File "/Users/qzhou/mlc-llm/mlc_llm/__init__.py", line 1, in <module>
    from . import dispatch
  File "/Users/qzhou/mlc-llm/mlc_llm/dispatch/__init__.py", line 1, in <module>
    from .dispatch_tir_operator import DispatchTIROperator
  File "/Users/qzhou/mlc-llm/mlc_llm/dispatch/dispatch_tir_operator.py", line 2, in <module>
    import tvm
ModuleNotFoundError: No module named 'tvm'
```

```
error: module 'tvm.script.parser.tir' has no attribute 'bitwise_and'
 --> /home/qzhou/mlc-llm/mlc_llm/dispatch/dispatch_tir_operator_adreno.py:26:21
   |
26 |     T.bitwise_and(
   |       ^^^^^^^^^^^
note: run with TVM_BACKTRACE=1 environment variable to display a backtrace.
```
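The `ModuleNotFoundError` above means the interpreter running `mlc_llm.build` cannot see a `tvm` package at all. A quick, hedged way to check this (a generic sketch, not part of mlc-llm) is to probe the import machinery directly:

```python
import importlib.util

def import_status(module_name):
    """Report whether a top-level module is importable from this interpreter.

    Handy for checking that `tvm` is visible before running
    `python3 -m mlc_llm.build`.
    """
    spec = importlib.util.find_spec(module_name)
    if spec is None:
        return f"{module_name}: NOT FOUND on sys.path"
    return f"{module_name}: found ({spec.origin})"

print(import_status("tvm"))
```

If this prints `NOT FOUND`, the wheel was installed into a different interpreter (e.g. system Python vs. a virtualenv) than the one running the build command.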

```
$ pip3 list | grep tvm
apache-tvm    0.11.1
```

```
$ pip3 list | grep mlc
mlc-llm    0.1.dev391+g2ae8907
```

qizzzh commented 1 year ago

Installed a newer version of TVM and now hit a different issue:

```
pip3 install apache-tvm==0.14.dev148
```

```
$ python3 -m mlc_llm.build --help
Traceback (most recent call last):
  File "/usr/lib/python3.8/runpy.py", line 185, in _run_module_as_main
    mod_name, mod_spec, code = _get_module_details(mod_name, _Error)
  File "/usr/lib/python3.8/runpy.py", line 111, in _get_module_details
    __import__(pkg_name)
  File "/home/qzhou/.local/lib/python3.8/site-packages/mlc_llm/__init__.py", line 2, in <module>
    from . import quantization
  File "/home/qzhou/.local/lib/python3.8/site-packages/mlc_llm/quantization/__init__.py", line 1, in <module>
    from .quantization import FQuantize
  File "/home/qzhou/.local/lib/python3.8/site-packages/mlc_llm/quantization/quantization.py", line 6, in <module>
    from tvm import relax, te
ImportError: cannot import name 'relax' from 'tvm' (/home/qzhou/.local/lib/python3.8/site-packages/tvm/__init__.py)
```
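The `cannot import name 'relax'` error indicates the installed `tvm` imports fine but is not a TVM Unity build (the `apache-tvm` releases do not ship `relax`). A small, generic sketch for distinguishing the two cases (helper name is mine, not from mlc-llm):

```python
import importlib

def module_has_attr(module_name, attr):
    """Return True if `module_name` imports cleanly AND exposes `attr`.

    e.g. module_has_attr("tvm", "relax") separates a TVM Unity wheel
    (has relax) from a stock apache-tvm release (does not).
    """
    try:
        mod = importlib.import_module(module_name)
    except ImportError:
        return False
    return hasattr(mod, attr)
```

`module_has_attr("tvm", "relax")` returning `False` while `import tvm` succeeds matches exactly the failure mode in the traceback above.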

qizzzh commented 1 year ago

Tried a few other versions. No luck.

aowen14 commented 1 year ago

Like the OP, I got:

```
Traceback (most recent call last):
  File "/usr/lib/python3.8/runpy.py", line 185, in _run_module_as_main
    mod_name, mod_spec, code = _get_module_details(mod_name, _Error)
  File "/usr/lib/python3.8/runpy.py", line 111, in _get_module_details
    __import__(pkg_name)
  File "/home/lambda1/AlexCode/mlc-llm/mlc_llm/__init__.py", line 1, in <module>
    from . import dispatch
  File "/home/lambda1/AlexCode/mlc-llm/mlc_llm/dispatch/__init__.py", line 1, in <module>
    from .dispatch_tir_operator import DispatchTIROperator
  File "/home/lambda1/AlexCode/mlc-llm/mlc_llm/dispatch/dispatch_tir_operator.py", line 2, in <module>
    import tvm
```

after running the steps:

```
git clone --recursive https://github.com/mlc-ai/mlc-llm.git
cd mlc-llm
pip install .
python3 -m mlc_llm.build --help
```

Also got this when using the conda install steps available here: https://mlc.ai/package/

```
conda create -n mlc-chat-venv -c mlc-ai -c conda-forge mlc-chat-nightly
conda activate mlc-chat-venv
```

MasterJH5574 commented 1 year ago

Hello folks, I don't think it's necessary to run `pip install .`. We will check whether that instruction is redundant. Just cloning the repo is fine.

Meanwhile, to install TVM, we recommend using the pip instructions listed at https://mlc.ai/package/ to install the TVM nightly build.
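For reference, the pip route from https://mlc.ai/package/ looked roughly like the sketch below at the time (the package name `mlc-ai-nightly` and the wheel index URL are assumptions on my part; check that page for the current command):

```shell
# Use a clean virtual environment so an existing apache-tvm install
# cannot shadow the Unity wheel.
python3 -m venv mlc-venv
. mlc-venv/bin/activate

# Install the TVM Unity nightly from the MLC wheel index.
pip install --pre -f https://mlc.ai/wheels mlc-ai-nightly
```

A clean environment also rules out the Python-version mismatch visible in this thread (3.10 in one traceback, 3.8 in the other).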

qizzzh commented 1 year ago

The `pip install .` step is in the instructions.

https://mlc.ai/package/ doesn't work for me, same issue as https://github.com/mlc-ai/mlc-llm/issues/803.

qizzzh commented 1 year ago

@MasterJH5574 any idea on how to fix it? I'm not able to run mlc-llm.

Hzfengsy commented 1 year ago

Sorry, we are not experts in PyPI packaging. We provide x86_64 Linux wheels, and we do not know why it fails on some devices.

We are more than happy to help build new wheels if you can provide the instructions :)

qizzzh commented 1 year ago

Is there any fix pushed? Apparently I'm not the only one hitting the issue.

Hzfengsy commented 1 year ago

The original issue is resolved, I think, since you were using the wrong package (`apache-tvm` instead of the MLC wheels).

As for the PyPi issue, another issue https://github.com/mlc-ai/mlc-llm/issues/803 is still open.

Meanwhile, I recommend compiling the MLC wheels from source if possible. Feel free to open another issue if you run into problems during compilation.
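A from-source TVM Unity build followed the standard TVM CMake flow; a hedged sketch is below (the repo URL and paths are assumptions on my part; follow https://llm.mlc.ai/docs/install/tvm.html for the authoritative steps):

```shell
# Sketch: build TVM Unity from source and point Python at it.
git clone --recursive https://github.com/mlc-ai/relax.git tvm-unity
cd tvm-unity
mkdir build && cd build
cp ../cmake/config.cmake .      # edit to enable CUDA/Metal/Vulkan as needed
cmake .. && make -j"$(nproc)"

# Make the freshly built Python package importable.
export TVM_HOME="$(cd .. && pwd)"
export PYTHONPATH="$TVM_HOME/python:$PYTHONPATH"
```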

ZhenyuYangGithub commented 10 months ago

> Is there any fix pushed? Apparently I'm not the only one hitting the issue.

Did you solve the problem?

ZhenyuYangGithub commented 10 months ago

> The original issue is resolved I think, as you are using the wrong package (apache-tvm vs. mlc wheels).
>
> As for the PyPi issue, another issue #803 is still open.
>
> Meanwhile, I recommend you compile MLC wheels from the source if possible. Feel free to open another issue if you face problems during compiling

How do we solve the original issue?

junrushao commented 10 months ago

This particular issue means you are not installing the correct PyPI package… I don't know who released this `apache-tvm` package, but it's apparently not right.

Please refer to our instructions for more details on how to install TVM Unity: https://llm.mlc.ai/docs/install/tvm.html#option-1-prebuilt-package
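Since the root cause throughout this thread is an unwanted `apache-tvm` distribution shadowing the MLC wheel, one hedged way to surface the conflict (a generic sketch using the standard library, not an mlc-llm tool) is to list every installed distribution whose name mentions "tvm":

```python
from importlib import metadata

def tvm_distributions():
    """List installed distributions whose name contains 'tvm'.

    If this returns ['apache-tvm==...'] rather than an MLC/Unity wheel,
    the interpreter is picking up the wrong package.
    """
    found = []
    for dist in metadata.distributions():
        name = dist.metadata["Name"]
        if name and "tvm" in name.lower():
            found.append(f"{name}=={dist.version}")
    return sorted(found)

print(tvm_distributions())
```

Uninstalling whatever this lists (`pip uninstall apache-tvm`) before installing the wheel from https://llm.mlc.ai/docs/install/tvm.html avoids the mixed-package state seen above.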