Grimmins opened this issue 2 weeks ago
You cannot use the conversation template directly; you need to process the input according to our demo. You seem to be missing a step in the dtype conversion process. Please carefully review how the input is handled in our CLI demo before chatting.
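For reference, a minimal sketch of what CLI-demo-style preprocessing typically looks like with Transformers (the model id, the `bfloat16` dtype, and the use of `apply_chat_template` are assumptions for illustration, not taken from this thread):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL = "your-org/your-chat-model"  # hypothetical model id; substitute your own

tokenizer = AutoTokenizer.from_pretrained(MODEL, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    MODEL,
    torch_dtype=torch.bfloat16,  # load weights in an explicit dtype up front;
    trust_remote_code=True,      # loading in fp32 and casting later is where
).eval()                         # dtype mismatches often creep in

# Build the prompt through the model's chat template
# instead of feeding raw text to the tokenizer.
inputs = tokenizer.apply_chat_template(
    [{"role": "user", "content": "Hello"}],
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

out = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(out[0][inputs.shape[1]:], skip_special_tokens=True))
```

If the WebUI skips either of these steps (the chat template or the explicit dtype), empty generations like the one reported below are a plausible symptom.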
Do you have any idea how to fix the problem via the WebUI interface? Just change the dtype? That shouldn't be too complicated...
System Info / 系統信息
CUDA 12.5, Transformers 4.41.2, Python 3.12, Ubuntu 22.04
Who can help? / 谁可以帮助到您?
No response
Information / 问题信息
Reproduction / 复现过程
Steps to reproduce (yes, you have to use the oobabooga webui, but that's not their fault; it may be a mistake on my part, in which case it's not really an issue):
Here is the traceback:
We can see that the model is loaded, and the last line says that a token has been generated. But the token is empty (the answer is ""). I'm sure this is caused by the errors that appear.
Expected behavior / 期待表现
Have an answer to my question.