simonw / llm-mlc

LLM plugin for running models using MLC
Apache License 2.0

Fix #21 -- update import path for mlc_chat function #22

Open amureki opened 7 months ago

amureki commented 7 months ago

Greetings, Simon!

I noticed that the current version of llm-mlc does not work due to the changes made in mlc-ai, specifically in this PR: https://github.com/mlc-ai/mlc-llm/pull/1502/

`get_delta_message` became private, so I am not sure we want to rely on it, but as a hotfix this felt acceptable.
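
For context, here is a minimal sketch of what such an import-path hotfix could look like, assuming the helper lives in `mlc_chat.callback` and was renamed to `_get_delta_message` (the exact module path is an assumption on my part, not taken from this PR's diff):

```python
# Hedged sketch: prefer the new private name, fall back to the old public one.
# Assumes the helper lives in mlc_chat.callback; adjust the path if it differs.
try:
    # Newer mlc-ai releases expose only the private name.
    from mlc_chat.callback import _get_delta_message as get_delta_message
except ImportError:
    # Older releases still ship the public helper.
    from mlc_chat.callback import get_delta_message
```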

For a long-term solution, I guess we'd need to switch to using [StreamToStdout](https://github.com/mlc-ai/mlc-llm/blob/ac57c03ccc1ec8e9d8079d6577c5c135dd80bec0/python/mlc_chat/callback.py#L72), but I am not familiar enough with the codebases to feel comfortable making a bigger change yet.
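
To illustrate the direction, a rough sketch of the `StreamToStdout` usage as documented for `mlc_chat` is below; a plugin like llm-mlc would presumably need a custom callback that yields chunks instead of printing to stdout, so treat this only as a starting point (model name and `callback_interval` are placeholders):

```python
from mlc_chat import ChatModule
from mlc_chat.callback import StreamToStdout

# Sketch only: load a compiled model and stream generated tokens to stdout.
cm = ChatModule(model="Llama-2-7b-chat-hf-q4f16_1")
cm.generate(
    prompt="What is the capital of France?",
    progress_callback=StreamToStdout(callback_interval=2),
)
```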

Happy to hear your thoughts.

Best, Rust