sreejith-ios opened this issue 1 week ago
Hello, the documentation for Linux (https://llm.mlc.ai/docs/install/mlc_llm.html#install-mlc-packages) recommends using Python 3.11. Does that help?
Hi @BlindDeveloper, I am using an Intel Arc GPU on a Windows 10 machine.
Hi @BlindDeveloper, I am following the official documentation from mlc-llm.
@sreejith-ios If you launch MLC LLM on your Windows computer using Python 3.11, is the bug still present?
Error: Using LLVM 19.1.1 with -mcpu=apple-latest is not valid in -mtriple=arm64-apple-macos, using default -mcpu=generic.
I wonder why apple and macos show up in your -mcpu and -mtriple on a Windows machine.
What's your command to compile or run the model?
I am using the `mlc_llm chat MODEL [--model-lib PATH-TO-MODEL-LIB]` command from the official mlc-llm documentation for LLM inference, after converting the model downloaded from HF to MLC format using https://llm.mlc.ai/docs/compilation/convert_weights.html
Following https://llm.mlc.ai/docs/deploy/cli.html#id2, I get the above error on my CLI, which I copied into this ticket.
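For reference, a sketch of the documented workflow being described (the model directory, output path, and quantization choice are placeholders, not the reporter's actual values):

```shell
# Convert the HF-downloaded weights to MLC format, per the convert_weights guide
# (model path and quantization are placeholders)
mlc_llm convert_weight ./dist/models/Llama-2-7b-chat-hf/ \
    --quantization q4f16_1 \
    -o ./dist/Llama-2-7b-chat-hf-q4f16_1-MLC

# Generate the chat config for the converted weights
mlc_llm gen_config ./dist/models/Llama-2-7b-chat-hf/ \
    --quantization q4f16_1 --conv-template llama-2 \
    -o ./dist/Llama-2-7b-chat-hf-q4f16_1-MLC

# Run chat; if --model-lib is omitted, the model library is compiled on the fly,
# which is where a wrong auto-detected LLVM target (-mtriple/-mcpu) would surface
mlc_llm chat ./dist/Llama-2-7b-chat-hf-q4f16_1-MLC
```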
I am encountering issues while trying to access the GPU for LLM inferencing with mlc-llm on Windows.
To Reproduce
Error Messages
Error: Using LLVM 19.1.1 with -mcpu=apple-latest is not valid in -mtriple=arm64-apple-macos, using default -mcpu=generic.
Expected behavior
I expected to access the GPU for LLM inferencing without encountering configuration-related errors.
Environment
- How installed: conda, pip
- Commit hash: dc87019cb805d0a1f0075f6415cc979ef337ec2a
Additional context
I have verified that llvm-config.exe is accessible and reports the correct version. Despite explicitly setting the target platform to Vulkan and ensuring all packages and dependencies are up to date, I still encounter this issue when trying to access the GPU.
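One way to narrow this down (a sketch, assuming the Vulkan SDK tools are installed and the model directory name is a placeholder) is to confirm the GPU is visible to Vulkan and then pass the device explicitly rather than relying on auto-detection:

```shell
# List Vulkan-capable devices; the Intel Arc GPU should appear here
# (vulkaninfo ships with the Vulkan SDK / GPU driver tools)
vulkaninfo --summary

# Force the Vulkan device explicitly instead of letting mlc_llm auto-detect;
# the model directory is a placeholder for your converted MLC model
mlc_llm chat ./dist/Llama-2-7b-chat-hf-q4f16_1-MLC --device vulkan
```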