[Open] macdrifter opened this issue 1 year ago
I'm not sure what's going on there! Thanks for the report. Leaving this open in case someone else figures out a solution in the short-term.
@macdrifter, you need to copy the missing mlc lib files from https://github.com/mlc-ai/binary-mlc-llm-libs to ~/.config/io.datasette.llm/mlc/dist/prebuilt/lib/, or wherever you store them.
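A quick sketch of that copy step, assuming the default llm config location (the directory path is taken from this thread; the lib filename in the comment is just an example — pick the one matching your model and platform from the binary-mlc-llm-libs repo):

```shell
# Directory where llm-mlc looks for prebuilt model libs
# (default llm config dir; adjust LIB_DIR if yours lives elsewhere).
LIB_DIR="$HOME/.config/io.datasette.llm/mlc/dist/prebuilt/lib"
mkdir -p "$LIB_DIR"

# Then copy the lib you fetched from the binary-mlc-llm-libs repo, e.g.:
#   cp ~/Downloads/WizardMath-7B-V1.0-q4f16_1-cuda.so "$LIB_DIR"/

# Confirm the directory exists and list what's in it:
ls "$LIB_DIR"
```

After the lib is in place, re-run the failing `llm -m ...` command to check it is picked up.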
I think they just added https://github.com/mlc-ai/binary-mlc-llm-libs/blob/main/WizardMath-7B-V1.0-q4f16_1-cuda.so
Yep, @irthomasthomas is right. Just check whether a Metal build exists here: https://github.com/mlc-ai/binary-mlc-llm-libs
OK, so following this guide, it's pretty straightforward to compile the one you want for Metal: https://mlc.ai/mlc-llm/docs/compilation/compile_models.html#
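For reference, here is a sketch of what that build step looks like. Hedged heavily: the mlc-llm CLI has changed between releases, and the entry point and flags below (`mlc_llm.build`, `--model`, `--quantization`, `--target`) are assumptions based on the older build flow — verify them against the linked compilation guide before running. The script only prints the command so you can review it first:

```shell
# Assumed values based on the model discussed in this thread;
# swap in the model directory you actually downloaded.
MODEL_DIR="dist/models/WizardMath-7B-V1.0"   # hypothetical local path
QUANT="q4f16_1"

# Print (not run) the assumed compile invocation targeting Metal:
echo python3 -m mlc_llm.build \
  --model "$MODEL_DIR" \
  --quantization "$QUANT" \
  --target metal
```

If the command checks out against the docs, drop the `echo` and run it; the resulting lib then goes into the prebuilt/lib directory mentioned above.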
I'll drop some PRs with the models I build.
Would anyone be willing to provide more detailed instructions, specific to files already downloaded via llm mlc?
I am specifically getting this error with Mistral 7B, which also has no prebuilt libs for Metal. It would be nice to be able to compile the libs directly from the already-downloaded models path rather than having to do it from another directory.
@slhck Did you get unstuck?
No, I haven't revisited this topic since.
The examples with Llama 2 in the documentation work just fine, but when I tried to download and use a different MLC model I got an error. The model is shown in the list of available models, so I believe it was installed correctly.
Steps to reproduce

```
llm mlc download-model https://huggingface.co/mlc-ai/mlc-chat-WizardMath-7B-V1.0-q4f16_1
llm mlc models
llm -m mlc-chat-WizardMath-7B-V1.0-q4f16_1 "What is the square root of 5012?"
```

Error message
Thank you for sharing this tool. It is by far one of the best I've used, and one of the best documented.