Greetings, Simon!
I noticed that the current version of `llm-mlc` does not work due to changes in mlc-ai, specifically in this PR: https://github.com/mlc-ai/mlc-llm/pull/1502. `get_delta_message` became private, so I am not sure we want to rely on it, but as a hotfix I felt this might be acceptable.

For a long-term solution, I guess we'd need to switch to using [StreamToStdout](https://github.com/mlc-ai/mlc-llm/blob/ac57c03ccc1ec8e9d8079d6577c5c135dd80bec0/python/mlc_chat/callback.py#L72), but I am not yet familiar enough with the codebases to feel comfortable making a bigger change.

Happy to hear your thoughts.
Best, Rust