wenda-LLM / wenda

Wenda (闻达): an LLM invocation platform. Its goal is efficient content generation for specific environments, while accounting for the limited computing resources of individuals and small-to-medium businesses, as well as knowledge security and privacy concerns.
GNU Affero General Public License v3.0

Qwen model error #472

Open kknd222 opened 1 year ago

kknd222 commented 1 year ago

After launching, it reports the error: Pass argument stream to model.chat() is buggy, deprecated, and marked for removal. Please use model.chat_stream(...) instead of model.chat(..., stream=True).

Changing for response in model.chat(tokenizer, prompt, history=history, stream=True): in llms\llm_qwen.py to for response in model.chat_stream(tokenizer, prompt, history=history): fixes it.
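A minimal sketch of the patched loop. The variable names tokenizer, prompt, and history come from the issue; the wrapper function stream_chat and its fallback logic are illustrative additions, not code from llm_qwen.py. The shim prefers the new chat_stream API when the loaded Qwen checkpoint provides it, and falls back to the deprecated chat(..., stream=True) call for older checkpoints:

```python
def stream_chat(model, tokenizer, prompt, history=None):
    """Yield partial responses, preferring Qwen's newer chat_stream API.

    Hypothetical helper: newer Qwen builds deprecate chat(..., stream=True)
    in favor of chat_stream(...), so we dispatch on what the model exposes.
    """
    history = history or []
    if hasattr(model, "chat_stream"):
        # New API: chat_stream(...) replaces chat(..., stream=True)
        yield from model.chat_stream(tokenizer, prompt, history=history)
    else:
        # Old API fallback, deprecated and slated for removal upstream
        yield from model.chat(tokenizer, prompt, history=history, stream=True)
```

The caller's loop then stays the same regardless of checkpoint version: for response in stream_chat(model, tokenizer, prompt, history=history): ...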

shen5455 commented 1 year ago

Or just change model.chat to model.chat_stream (and drop the stream=True argument).

hahajinghuayuan commented 11 months ago

Same question here. After loading knowledge-base mode with qwen-14b-chat, it can't retrieve anything from the knowledge base, but switching to chatglm2-6b works fine. What's going on?