Closed: HaoRanLyu closed this issue 5 months ago
Thank you for trying to deploy VideoChat locally. Before deploying the model, please make sure the LLM weights have been converted.
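For reference, original LLaMA checkpoints usually need to be converted to the Hugging Face format before they can be loaded by deployment code; loading unconverted weights is a common cause of garbled output. A minimal sketch using the conversion script shipped with the `transformers` library is below. The paths are placeholders and must be adapted to your setup; whether this is the exact conversion step VideoChat expects is an assumption.

```shell
# Sketch: convert original LLaMA 13B checkpoints to Hugging Face format.
# The input/output paths below are placeholders, not real locations.
python -m transformers.models.llama.convert_llama_weights_to_hf \
    --input_dir /path/to/llama/original \
    --model_size 13B \
    --output_dir /path/to/llama-13b-hf
```

After conversion, point the model-loading config at the output directory containing `config.json` and the tokenizer files.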
Because this issue has been inactive for a long time, it has been temporarily closed. If you still have problems, please feel free to reopen it.
Hello, I tried to deploy the VideoChat model locally using llama 13b, but when I run it for question answering, the model outputs garbled text. What could be the cause?