simonw / llm-mlc

LLM plugin for running models using MLC
Apache License 2.0

llm mlc setup complained that it needed mlc_chat #6

Closed: zellyn closed this issue 1 year ago

zellyn commented 1 year ago

In the install instructions, installing mlc_chat comes after llm mlc setup, but I had to do the mlc_chat installation step before setup would work:

~/ve llm mlc setup       
Downloading prebuilt binaries...
Cloning into '/Users/zellyn/Library/Application Support/io.datasette.llm/mlc/dist/prebuilt/lib'...
remote: Enumerating objects: 221, done.
remote: Counting objects: 100% (86/86), done.
remote: Compressing objects: 100% (54/54), done.
remote: Total 221 (delta 59), reused 56 (delta 32), pack-reused 135
Receiving objects: 100% (221/221), 52.06 MiB | 10.44 MiB/s, done.
Resolving deltas: 100% (152/152), done.
Ready to install models in /Users/zellyn/Library/Application Support/io.datasette.llm/mlc
Error: You must install mlc_chat first. See https://github.com/simonw/llm-mlc for instructions.
~/ve llm install mlc_chat
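
For anyone hitting the same thing, the order below is what worked for me. Treat it as a sketch: the mlc_chat install line is the one the MLC project documented at the time, and the README's exact invocation (or the wheel index / package names) may differ.

# 1. Install the plugin into the same environment as LLM
llm install llm-mlc

# 2. Install mlc_chat first (command per the MLC docs; assumed here,
#    check the README for the current invocation)
pip install --pre --force-reinstall mlc-ai-nightly mlc-chat-nightly -f https://mlc.ai/wheels

# 3. Setup now finds mlc_chat and can download the prebuilt binaries
llm mlc setup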
simonw commented 1 year ago

Oh I see - yeah, I'll change the order of the steps in the README.

simonw commented 1 year ago

OK, https://github.com/simonw/llm-mlc/blob/be63ebd95dfc8928415bdc0261db1a8249f7dfff/README.md#installation is much better now, thanks!