Closed Summer-seu closed 8 months ago
Change `tokenizer.decode` to `tokenizer.batch_decode`.
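A minimal sketch of the difference between the two calls. `StubTokenizer` below is a stand-in used only for illustration; with the project's real tokenizer (e.g. a Hugging Face tokenizer) the call pattern is the same: `decode` handles one sequence of token ids, while `batch_decode` handles a list of sequences, one output string per sequence.

```python
class StubTokenizer:
    """Stand-in tokenizer to illustrate decode vs. batch_decode."""
    vocab = {0: "<pad>", 1: "hello", 2: "world"}

    def decode(self, ids):
        # Decodes ONE sequence of token ids into a single string.
        return " ".join(self.vocab[i] for i in ids)

    def batch_decode(self, batch_ids):
        # Decodes a LIST of sequences, returning one string per sequence.
        return [self.decode(ids) for ids in batch_ids]


tokenizer = StubTokenizer()

# Single-sequence decode (the original demo.py pattern):
print(tokenizer.decode([1, 2]))        # hello world

# Batched decode: pass all generated sequences at once,
# e.g. the output of a batched model.generate(...) call:
outputs = [[1, 2], [2, 1]]
print(tokenizer.batch_decode(outputs))  # ['hello world', 'world hello']
```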
Because this issue has not been updated for a long time, it has been temporarily closed. If you still have any problems, please feel free to reopen it.
Thank you for sharing your great work! I would like to ask how to use the `chat.generate` function in `demo.py` to generate answers in batches.