X-D-Lab / LangChain-ChatGLM-Webui

Automatic question answering over a local knowledge base, built on LangChain and the ChatGLM-6B family of LLMs
Apache License 2.0

Asking a question after loading the BELLE or Vicuna model raises an error? #97

Open emiyadavid opened 1 year ago

emiyadavid commented 1 year ago

chatglm-6B-int8 loads fine and answers questions normally, but after loading the BELLE-7b or Vicuna-7b model, asking a question shows ERROR on the page, and the backend logs the following: TypeError: The current model class (LlamaModel) is not compatible with .generate(), as it doesn't have a language model head. Please use one of the following classes instead: {'LlamaForCausalLM'}

A breakpoint locates the failure at this line in the get_knowledge_based_answer function of the KnowledgeBasedChatLLM class: result = knowledge_chain({"query": query})

Yanllan commented 8 months ago

Based on the error, it seems the Vicuna-7b model does not support the .generate() method. If possible, please try a different model, for example a chat-gpt 7B model. A generative LLM is recommended.

emiyadavid commented 8 months ago

Okay, received!