Closed: varshith15 closed this issue 11 months ago
Hi @varshith15, thank you for reporting the issue! I was able to reproduce it on my end. Confirmed that https://github.com/apache/tvm/pull/16132 fixes this; there is currently an issue when we reset the chat.
@CharlieFRuan the issue persists even without a chat reset in the loop; maybe because I use the LM conv_template?
Oh yes, I can reproduce that as well; it is also fixed by that PR, so the LM conv_template should be fine. I didn't notice that we call reset_chat() in generate() as well.
The PR should be merged soon! You could build from source in the meantime if you're in a hurry.
Ah got it, yeah, reset_chat() wasn't part of generate() before. Thanks!
Hi @varshith15, the fix should be included in https://github.com/mlc-ai/relax now; let me know if there are other issues.
@CharlieFRuan it works fine now, thanks!
🐛 Bug
Running cm.generate(prompt=text) in a loop for the Mistral model gives the following error:
To Reproduce
Steps to reproduce the behavior:
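A minimal sketch of the failing loop, assuming the mlc_chat ChatModule Python API of that period; the model id and prompts here are placeholders, and the LM conv_template mirrors the reporter's setup:

```python
from mlc_chat import ChatModule, ChatConfig

# Placeholder local model id; the reporter used a Mistral model with
# the LM conv_template.
cm = ChatModule(
    model="Mistral-7B-Instruct-v0.1-q4f16_1",
    chat_config=ChatConfig(conv_template="LM"),
)

for text in ["First prompt.", "Second prompt."]:
    # On affected versions, the second iteration errors because generate()
    # internally triggers a chat reset (fixed by apache/tvm PR 16132).
    print(cm.generate(prompt=text))
```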
mlc-chat-config used
Environment
- How you installed MLC-LLM (conda, source):
- How you installed TVM-Unity (pip, source):
- TVM Unity Hash Tag (python -c "import tvm; print('\n'.join(f'{k}: {v}' for k, v in tvm.support.libinfo().items()))", applicable if you compile models):

@davidpissarra @CharlieFRuan any idea why this is happening?