chatchat-space / Langchain-Chatchat

Langchain-Chatchat (formerly Langchain-ChatGLM): a local-knowledge-based RAG and Agent application built on Langchain and LLMs such as ChatGLM, Qwen, and Llama.
Apache License 2.0

[BUG] Standalone knowledge-base access via the UI and API raises KeyError: 'template' #4500

Closed — ELvis168 closed this 1 month ago

ELvis168 commented 1 month ago

Problem Description: Following the API example for standalone knowledge-base access (/chat/chat/completions): set tool_choice directly to "search_local_knowledgebase", then pass the tool arguments via tool_input to invoke the tool manually and chat against a specific knowledge base.

import requests

base_url = "http://127.0.0.1:7861/chat"
data = {
    "messages": [
        {"role": "user", "content": "如何提问以获得高质量答案"},
    ],
    "model": "glm-4",
    # Call the knowledge-base tool directly instead of letting the model choose:
    "tool_choice": "search_local_knowledgebase",
    "tool_input": {"database": "samples", "query": "如何提问以获得高质量答案"},
    "stream": True,
}

response = requests.post(f"{base_url}/chat/completions", json=data, stream=True)
for line in response.iter_content(None, decode_unicode=True):
    print(line)
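With "stream": True, an OpenAI-compatible endpoint normally emits Server-Sent-Events-style lines of the form `data: {...}`. A minimal sketch of parsing such lines into dicts (the chunk below is synthetic, and the exact field layout is an assumption about the OpenAI-style format, not confirmed output from this server):

```python
import json

def parse_sse_line(line: str):
    """Parse one SSE line into a dict; return None for blanks and sentinels."""
    line = line.strip()
    if not line or not line.startswith("data:"):
        return None
    payload = line[len("data:"):].strip()
    if payload == "[DONE]":  # end-of-stream sentinel used by OpenAI-style APIs
        return None
    return json.loads(payload)

# Example with a synthetic chunk (not actual server output):
chunk = 'data: {"choices": [{"delta": {"content": "hello"}}]}'
parsed = parse_sse_line(chunk)
print(parsed["choices"][0]["delta"]["content"])  # -> hello
```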

Steps to Reproduce: Call the standalone knowledge-base API as in the example above.

Expected Result: The request returns a normal (streamed) answer.

Actual Result: The request fails with a 500 Internal Server Error; see the traceback below.

/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/langchain_core/vectorstores.py:342: UserWarning: No relevant docs were retrieved using the relevance score threshold 2.0
  warnings.warn(
INFO:     127.0.0.1:57094 - "POST /chat/chat/completions HTTP/1.1" 500 Internal Server Error
2024-07-13 19:39:53,106 httpx        26582 INFO     HTTP Request: POST http://127.0.0.1:7861/chat/chat/completions "HTTP/1.1 500 Internal Server Error"
2024-07-13 19:39:53,107 openai._base_client 26582 INFO     Retrying request to /chat/completions in 1.642228 seconds
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/uvicorn/protocols/http/h11_impl.py", line 396, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/starlette/routing.py", line 758, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/starlette/routing.py", line 778, in app
    await route.handle(scope, receive, send)
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/starlette/routing.py", line 299, in handle
    await self.app(scope, receive, send)
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/starlette/routing.py", line 79, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/starlette/routing.py", line 74, in app
    response = await func(request)
               ^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/fastapi/routing.py", line 299, in app
    raise e
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/fastapi/routing.py", line 294, in app
    raw_response = await run_endpoint_function(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
    return await dependant.call(**values)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/chatchat/server/api_server/chat_routes.py", line 120, in chat_completions
    prompt_template = PromptTemplate.from_template(
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/langchain_core/prompts/prompt.py", line 252, in from_template
    return cls(
           ^^^^
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/pydantic/v1/main.py", line 339, in __init__
    values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/pydantic/v1/main.py", line 1100, in validate_model
    values = validator(cls_, values)
             ^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/langchain3.1/lib/python3.11/site-packages/langchain_core/prompts/prompt.py", line 143, in template_is_valid
    values["template"], values["template_format"]
    ~~~~~~^^^^^^^^^^^^
KeyError: 'template'
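The bottom of the traceback shows that `PromptTemplate.from_template` was constructed without a usable `template` kwarg, so langchain_core's pydantic validator reads `values["template"]` before the field exists. A toy reproduction of the same failure mode, with a simplified stand-in for the real validator (names and logic here are illustrative, not the actual langchain_core source):

```python
def template_is_valid(values: dict):
    # Simplified stand-in for langchain_core.prompts.prompt.template_is_valid:
    # it indexes values["template"] directly, so a missing key raises KeyError.
    return values["template"], values.get("template_format", "f-string")

print(template_is_valid({"template": "{{question}}"}))  # ok when the key exists

try:
    template_is_valid({})  # template never made it into the kwargs, as in the bug
except KeyError as exc:
    print("KeyError:", exc)  # -> KeyError: 'template'
```

This matches the maintainer's diagnosis below: the settings reorganization moved the rag template, the lookup came back empty, and the validator blew up on the missing key.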

环境信息 / Environment Information

- langchain-ChatGLM version/commit: v0.3.1
- Docker deployment: No
- Model: glm-4
- Embedding model: bge-large-zh-v1.5
- Vector store: faiss
- OS and version: CentOS
- Python version: 3.11

liunux4odoo commented 1 month ago

Confirmed as a bug: PromptSettings moved the location of the rag template, but the code was not updated accordingly. It will be fixed in the next release. As a workaround on the current version, manually edit the llm_model section of prompt_settings.yaml and add the following:

# Templates for plain LLM chat
llm_model:
  default: '{{input}}'
  with_history: "The following is a friendly conversation between a human and an AI.\n
    The AI is talkative and provides lots of specific details from its context.\n
    If the AI does not know the answer to a question, it truthfully says it does not
    know.\n\nCurrent conversation:\n{{history}}\nHuman: {{input}}\nAI:"
  rag: "【指令】根据已知信息,简洁和专业的来回答问题。如果无法从中得到答案,请说 “根据已知信息无法回答该问题”,不允许在答案中添加编造成分,答案请使用中文。\n\
    \n【已知信息】{{context}}\n\n【问题】{{question}}\n"
  rag_default: '{{question}}'
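After editing prompt_settings.yaml, a quick sanity check is to confirm the rag template still carries the placeholders the route fills in. A stdlib-only sketch (the placeholder names `context` and `question` come from the template above; the regex is a simplification of jinja2 parsing, not the real settings loader):

```python
import re

def jinja2_placeholders(template: str) -> set:
    """Collect {{name}} placeholders from a jinja2-style template string."""
    return set(re.findall(r"\{\{\s*(\w+)\s*\}\}", template))

rag_template = (
    "【指令】根据已知信息,简洁和专业的来回答问题。\n"
    "【已知信息】{{context}}\n\n【问题】{{question}}\n"
)
assert jinja2_placeholders(rag_template) == {"context", "question"}
print("rag template placeholders look correct")
```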
liunux4odoo commented 1 month ago

This is fixed in 0.3.1.1; please upgrade and try again.