chatchat-space / Langchain-Chatchat

Langchain-Chatchat (formerly Langchain-ChatGLM): local-knowledge-based RAG and Agent applications built with Langchain and LLMs such as ChatGLM, Qwen, and Llama
Apache License 2.0

[BUG] Brief problem description / Error when running chatglm3 via the API #2722

Closed WangShengzhao closed 9 months ago

WangShengzhao commented 9 months ago

ERROR | root | Error code: 400 - {'object': 'error', 'message': 'Only zhipu-api allowed now, your model chatglm3-6b', 'code': 40301}
Traceback (most recent call last):
  File "/home/wsz/Langchain-Chatchat-master/server/utils.py", line 23, in wrap_done
    await fn
  File "/home/wsz/anaconda3/envs/py310chat/lib/python3.10/site-packages/langchain/chains/base.py", line 385, in acall
    raise e
  File "/home/wsz/anaconda3/envs/py310chat/lib/python3.10/site-packages/langchain/chains/base.py", line 379, in acall
    await self._acall(inputs, run_manager=run_manager)
  File "/home/wsz/anaconda3/envs/py310chat/lib/python3.10/site-packages/langchain/chains/llm.py", line 275, in _acall
    response = await self.agenerate([inputs], run_manager=run_manager)
  File "/home/wsz/anaconda3/envs/py310chat/lib/python3.10/site-packages/langchain/chains/llm.py", line 142, in agenerate
    return await self.llm.agenerate_prompt(
  File "/home/wsz/anaconda3/envs/py310chat/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 553, in agenerate_prompt
    return await self.agenerate(
  File "/home/wsz/anaconda3/envs/py310chat/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 513, in agenerate
    raise exceptions[0]
  File "/home/wsz/anaconda3/envs/py310chat/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 616, in _agenerate_with_cache
    return await self._agenerate(
  File "/home/wsz/anaconda3/envs/py310chat/lib/python3.10/site-packages/langchain_community/chat_models/openai.py", line 522, in _agenerate
    return await agenerate_from_stream(stream_iter)
  File "/home/wsz/anaconda3/envs/py310chat/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 86, in agenerate_from_stream
    async for chunk in stream:
  File "/home/wsz/anaconda3/envs/py310chat/lib/python3.10/site-packages/langchain_community/chat_models/openai.py", line 488, in _astream
    async for chunk in await acompletion_with_retry(
  File "/home/wsz/anaconda3/envs/py310chat/lib/python3.10/site-packages/langchain_community/chat_models/openai.py", line 105, in acompletion_with_retry
    return await llm.async_client.create(**kwargs)
  File "/home/wsz/anaconda3/envs/py310chat/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 1291, in create
    return await self._post(
  File "/home/wsz/anaconda3/envs/py310chat/lib/python3.10/site-packages/openai/_base_client.py", line 1578, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
  File "/home/wsz/anaconda3/envs/py310chat/lib/python3.10/site-packages/openai/_base_client.py", line 1339, in request
    return await self._request(
  File "/home/wsz/anaconda3/envs/py310chat/lib/python3.10/site-packages/openai/_base_client.py", line 1429, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'object': 'error', 'message': 'Only zhipu-api allowed now, your model chatglm3-6b', 'code': 40301}
2024-01-19 16:11:59 | ERROR | root | BadRequestError: Caught exception: Error code: 400 - {'object': 'error', 'message': 'Only zhipu-api allowed now, your model chatglm3-6b', 'code': 40301}

dosubot[bot] commented 9 months ago

🤖

Hello, @WangShengzhao! Long time no see, I hope all is well. Thanks for bringing this issue to us.

You are getting error code 400 with the message 'Only zhipu-api allowed now, your model chatglm3-6b', most likely because the current implementation of the ChatGLMWorker class in the Langchain-Chatchat application only supports the 'zhipu-api' model. This can be seen in the __init__ method of ChatGLMWorker, where the default model name is set to 'zhipu-api':

class ChatGLMWorker(ApiModelWorker):
    DEFAULT_EMBED_MODEL = "text_embedding"

    def __init__(
        self,
        *,
        model_names: List[str] = ["zhipu-api"],
        controller_addr: str = None,
        worker_addr: str = None,
        version: Literal["chatglm_turbo"] = "chatglm_turbo",
        **kwargs,
    ):
        kwargs.update(model_names=model_names, controller_addr=controller_addr, worker_addr=worker_addr)
        kwargs.setdefault("context_len", 32768)
        super().__init__(**kwargs)
        self.version = version

In this code, the model_names parameter defaults to a list containing only 'zhipu-api'. This means that if you try to use a different model, such as 'chatglm3-6b', the application will not recognize it and will return an error.
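To make the failure mode concrete, here is a hypothetical sketch (not the project's actual code) of how a worker that only whitelists 'zhipu-api' would produce the 40301 error payload seen in the traceback:

```python
# Hypothetical illustration of a model-name whitelist check; the ALLOWED_MODELS
# list and validate_model function are assumptions, not Langchain-Chatchat code.
ALLOWED_MODELS = ["zhipu-api"]

def validate_model(model: str) -> dict:
    """Return an OpenAI-style error dict if the model is not whitelisted."""
    if model not in ALLOWED_MODELS:
        return {
            "object": "error",
            "message": f"Only zhipu-api allowed now, your model {model}",
            "code": 40301,
        }
    return {"object": "ok"}

print(validate_model("chatglm3-6b")["code"])  # 40301
```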

To resolve this, you would need to modify the code to support the 'chatglm3-6b' model, or whichever other model you want to use. This could involve adding 'chatglm3-6b' to the model name list in the __init__ method, or modifying other parts of the code to ensure compatibility with the 'chatglm3-6b' model.
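A minimal, runnable sketch of the first option (extending the default model name list); the real ApiModelWorker base class is stubbed here so the example is self-contained, and this alone may not make chatglm3-6b usable, since ChatGLMWorker forwards requests to the Zhipu API rather than a local model:

```python
from typing import List, Literal

# Stub standing in for the real ApiModelWorker base class (assumption).
class ApiModelWorker:
    def __init__(self, **kwargs):
        self.model_names = kwargs.get("model_names", [])

class ChatGLMWorker(ApiModelWorker):
    def __init__(
        self,
        *,
        # "chatglm3-6b" added alongside the original default "zhipu-api"
        model_names: List[str] = ["zhipu-api", "chatglm3-6b"],
        version: Literal["chatglm_turbo"] = "chatglm_turbo",
        **kwargs,
    ):
        kwargs.update(model_names=model_names)
        super().__init__(**kwargs)
        self.version = version

worker = ChatGLMWorker()
print(worker.model_names)  # ['zhipu-api', 'chatglm3-6b']
```

A locally loaded chatglm3-6b is normally configured through the project's model configuration instead, so treat this purely as an illustration of where the name check lives.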

As for the role of the 'zhipu-api' model within the Langchain-Chatchat application, it serves two main purposes:

  1. Chat: the do_chat method uses the 'zhipu-api' model to generate chat responses. It does this by calling the zhipuai.model_api.sse_invoke method with the chat parameters. The model's responses are then yielded as dictionaries containing the text and an error code.

  2. Generating embeddings: the do_embeddings method uses the 'zhipu-api' model to generate embeddings for a given text. It does this by calling the zhipuai.model_api.invoke method with the text and the embedding model. The embeddings are then returned as a dictionary containing the embeddings and a status code.

Here is the supporting evidence from the code:

For chat:

response = zhipuai.model_api.sse_invoke(
    model=params.version,
    prompt=params.messages,
    temperature=params.temperature,
    top_p=params.top_p,
    incremental=False,
)

For generating embeddings:

response = zhipuai.model_api.invoke(model=params.embed_model or self.DEFAULT_EMBED_MODEL, prompt=t)


zRzRzRzRzRzRzR commented 9 months ago

The glm integration needs to be rewritten; their library version is too new now and is no longer compatible.

Autism-al commented 6 months ago

Have you managed to solve this? I ran into the same problem today and couldn't resolve it after trying for a long time.