Closed: simonw closed this issue 1 year ago
I can't figure out how to implement conversation mode where conversations are loaded from SQLite first - I opened an issue asking about that here:
I'll skip conversation support for the first release.
https://github.com/mlc-ai/notebooks/blob/main/mlc-llm/tutorial_chat_module_getting_started.ipynb is a useful tutorial.
These instructions worked for getting it installed: https://mlc.ai/mlc-llm/docs/get_started/try_out.html#get-started
Installing the right package for the M1/M2 requires this command:
```
pip install --pre --force-reinstall \
  mlc-ai-nightly \
  mlc-chat-nightly \
  -f https://mlc.ai/wheels
```
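Since that command is specific to Apple Silicon, a quick stdlib check can confirm you're on an M1/M2 Mac before running it (this helper is just a sketch, not part of llm or mlc-llm):

```python
import platform

def is_apple_silicon() -> bool:
    # macOS reports "Darwin" as the system name, and M1/M2 machines
    # report "arm64" as the machine architecture
    return platform.system() == "Darwin" and platform.machine() == "arm64"

print(is_apple_silicon())
```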
The llm install command doesn't yet support all of those options. I'm going to add an llm mlc pip command as a temporary workaround, to ensure people install things in the correct virtual environment:
```
llm mlc pip install --pre --force-reinstall \
  mlc-ai-nightly \
  mlc-chat-nightly \
  -f https://mlc.ai/wheels
```
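The trick such a workaround can rely on is that sys.executable points at the interpreter llm itself is running under, so invoking pip as a module installs into that same virtual environment. A minimal sketch of the idea (the function names here are hypothetical, not the plugin's actual code):

```python
import subprocess
import sys

def pip_argv(pip_args):
    # sys.executable is the Python interpreter currently running llm,
    # so "-m pip" targets llm's own virtual environment
    return [sys.executable, "-m", "pip", *pip_args]

def llm_mlc_pip(pip_args):
    # Hypothetical handler for "llm mlc pip ..." - forwards the
    # user's arguments straight through to pip
    subprocess.run(pip_argv(pip_args), check=True)
```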
It's going to let you install and run models using https://github.com/mlc-ai/mlc-llm, e.g.