Closed TNT3530 closed 2 months ago
Should be fixed upstream.
Expected behavior
TVM should find the LLVM ld.lld file
Actual behavior
When running mlc_llm chat with JIT compiling on, TVM fails to find the LLVM installation, throwing:

RuntimeError: cannot find ld.lld, canditates are: ['ld.lld-17.0', 'ld.lld-17', 'ld.lld', '/opt/rocm/llvm/bin']
Environment
Testing in an MLC docker container with fresh installs of nightly
Steps to reproduce
Run mlc_llm chat HF://<model>. It will download the model, compile it, then crash when saving the .so file.

Triage
Line 55 of https://github.com/mlc-ai/relax/blob/mlc/python/tvm/contrib/rocm.py forgets to join ld.lld (or whichever candidate name it found in the lines above) onto the /opt/rocm/llvm/bin path, so the bare directory ends up in the candidate list. os.path.isfile in https://github.com/mlc-ai/relax/blob/mlc/python/tvm/contrib/utils.py#L253 returns False for directories, so the lookup returns None and the RuntimeError above is raised.
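A minimal sketch of the fix (the helper name find_lld, the candidate names, and the exact list construction are paraphrased from rocm.py, not copied verbatim): instead of appending the bare /opt/rocm/llvm/bin directory to the candidate list, join each ld.lld name onto it, since os.path.isfile is always False for a directory.

```python
import os
import shutil

# Assumption: default ROCm install prefix for the bundled LLVM tools.
ROCM_BIN = "/opt/rocm/llvm/bin"


def find_lld(rocm_bin=ROCM_BIN):
    """Sketch of a fixed lookup: try the names on PATH first, then try
    them joined onto the ROCm bin directory. The original bug appended
    the directory itself, which os.path.isfile() rejects."""
    names = ["ld.lld-17.0", "ld.lld-17", "ld.lld"]
    # Fixed line: join each name onto the directory instead of
    # appending the bare directory path.
    candidates = names + [os.path.join(rocm_bin, n) for n in names]
    for cand in candidates:
        found = shutil.which(cand) or (cand if os.path.isfile(cand) else None)
        if found:
            return found
    raise RuntimeError(f"cannot find ld.lld, candidates are: {candidates}")
```

Note that os.path.isfile("/opt/rocm/llvm/bin") is False whether or not ROCm is installed (it is a directory at best), which is exactly why the unjoined candidate can never match.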