nsgrantham opened 6 months ago
I am also having this problem, exactly as described but on an Apple M2, macOS 13.6.1.
Ah, it's actually running into this error:
>>> from mlc_chat.base import get_delta_message
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: cannot import name 'get_delta_message' from 'mlc_chat.base' (/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/mlc_chat/base.py)
I suspect the MLC nightly API has changed. Checking their changelog next...
Yeah that function was moved and made internal: https://github.com/mlc-ai/mlc-llm/pull/1502/files
@simonw Is there a more stable approach? I see a DeltaCallback class. Hm.
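For reference, here's a rough sketch of the callback-based shape of that API. The names (DeltaCallback, delta_callback, stopped_callback) are my reading of mlc_chat/callback.py and may not match your installed nightly, so the base class below is a local stand-in rather than the real thing — it just shows what a subclass that collects streamed deltas would look like:

```python
# Stand-in for mlc_chat.callback.DeltaCallback, so this sketch runs
# without mlc_chat installed. Method names are assumptions.
class DeltaCallback:
    def delta_callback(self, delta_message: str) -> None:
        raise NotImplementedError

    def stopped_callback(self) -> None:
        raise NotImplementedError


class CollectDeltas(DeltaCallback):
    """Accumulate streamed message deltas instead of printing them."""

    def __init__(self) -> None:
        self.chunks = []

    def delta_callback(self, delta_message: str) -> None:
        # Called once per streamed chunk of generated text.
        self.chunks.append(delta_message)

    def stopped_callback(self) -> None:
        # Called when generation finishes; nothing to clean up here.
        pass

    @property
    def text(self) -> str:
        return "".join(self.chunks)


cb = CollectDeltas()
for piece in ("Hel", "lo"):
    cb.delta_callback(piece)
cb.stopped_callback()
print(cb.text)  # Hello
```

If the plugin went this route, the callback object would be handed to the chat module's generate call rather than `get_delta_message` being invoked directly.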
I ran into this on another platform too. Chasing the import from the PR @pamelafox dug up fixed the issue:
diff --git a/llm_mlc.py b/llm_mlc.py
index 5939e5b..3da4441 100644
--- a/llm_mlc.py
+++ b/llm_mlc.py
@@ -254,7 +254,7 @@ class MlcModel(llm.Model):
def execute(self, prompt, stream, response, conversation):
try:
import mlc_chat
- from mlc_chat.base import get_delta_message
+ from mlc_chat.callback import _get_delta_message as get_delta_message
import mlc_chat.chat_module
except ImportError:
raise click.ClickException(MLC_INSTALL)
Happy to submit a PR with that one-line change, or one using the public DeltaCallback, if that'd be preferable.
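In the meantime, a defensive version of the import could try both locations, so the plugin works with nightlies from before and after the rename. This is just a sketch built from the two module paths mentioned in this thread; I haven't checked which nightlies ship which:

```python
def import_get_delta_message():
    """Resolve get_delta_message from whichever location this mlc_chat build uses.

    Tries the old public name (mlc_chat.base.get_delta_message) first, then
    the internal one (mlc_chat.callback._get_delta_message) introduced by
    mlc-llm PR #1502. Raises ImportError if mlc_chat is not installed at all.
    """
    try:
        from mlc_chat.base import get_delta_message  # older nightlies
        return get_delta_message
    except ImportError:
        from mlc_chat.callback import _get_delta_message  # newer nightlies
        return _get_delta_message
```

The plugin's existing try/except around the imports would still catch the ImportError and show the MLC_INSTALL message when mlc_chat is missing entirely.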
Same problem here; a fix would be appreciated. As of today, even with this patch in hand, it's unclear how to apply it inside the environment that llm manages.
What was the solution to this?
In llm_mlc.py I replaced the following line:
from mlc_chat.base import get_delta_message
with
from mlc_chat.callback import _get_delta_message as get_delta_message
It didn't solve it.
I have followed the instructions in the README, in particular:
llm mlc pip install --pre --force-reinstall mlc-ai-nightly mlc-chat-nightly -f https://mlc.ai/wheels
Then I downloaded the 15 GB Llama 2 model.
If I run
llm mlc models
it returns the expected output. Everything looks good.
However, if I execute a prompt against the model, I get an error that directs me back to the README instructions I followed to get here. Is mlc_chat installed as part of that llm mlc pip install step, or am I misunderstanding? What other steps do I need to execute to install mlc_chat?
(llm v0.12, Apple M1, macOS Ventura 13.6.3)
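One way to answer that from the interpreter itself is to check whether mlc_chat is importable and, if so, where it lives on disk. This uses only the stdlib's importlib.util.find_spec, so it runs regardless of what's installed — demonstrated below with json, since mlc_chat may well be absent:

```python
import importlib.util
from typing import Optional


def module_path(module_name: str) -> Optional[str]:
    """Return the on-disk path of an installed module, or None if it is absent."""
    spec = importlib.util.find_spec(module_name)
    return spec.origin if spec else None


# e.g. module_path("mlc_chat") tells you whether the nightly wheel actually
# installed it; shown here with a stdlib package that always exists:
print(module_path("json"))
```

Running `module_path("mlc_chat")` from the same Python that llm uses would confirm whether the pip step put the package somewhere llm can see it.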