chatchat-space / Langchain-Chatchat

Langchain-Chatchat (formerly langchain-ChatGLM): local knowledge based LLM (ChatGLM, Qwen, Llama, etc.) RAG and Agent app built with Langchain
Apache License 2.0

[BUG] Changes in the Prompt config have no effect; tested with GLM3-6B and QWEN1.5 13B #3703

Closed. XienXX closed this issue 5 months ago.

XienXX commented 6 months ago

Problem Description: Changes in the Prompt config have no effect. Tested so far with GLM3-6B and QWEN1.5 13B.


Environment Information

Could someone help? Is this a problem with the prompt itself, or are the prompt changes simply not reaching the application layer? What should I do?

dosubot[bot] commented 6 months ago

Hey there @XienXX! Great to see you back with us, diving into the digital depths of Langchain-Chatchat 🐙

It looks like the issue you're encountering with the prompt configuration not taking effect in Langchain-Chatchat v0.2.9 might be due to incorrect setup in the PROMPT_TEMPLATES dictionary within the configuration file. Ensure your template string adheres to the Jinja2 syntax and uses the correct variables for the conversation type you're working with. For instance, variables like {{ context }} or {{ question }} should only be used in conversation types where they are applicable, such as "knowledge_base_chat" and "search_engine_chat".

Here's a brief guide to help you set up your system prompt template correctly:

  1. Modify the PROMPT_TEMPLATES Dictionary: This is located in configs/prompt_config.py.example. Each key-value pair in this dictionary represents a different conversation type and its corresponding template. Make sure your template string is correctly formatted and uses the appropriate variables (a structural sketch follows this list).

  2. Select the Appropriate Prompt Template: When calling the API interface, the system selects the prompt template based on the conversation type. Ensure you're using the correct function to get your prompt template, such as get_prompt_template("llm_chat", prompt_name) for direct input or get_prompt_template("llm_chat", "with_history") for historical messages.

  3. Check for Null or Empty Values: The "None" comment in the bug tracking system screenshot suggests there might be a null reference or an empty value problem. Ensure that the variables used in your prompt templates are correctly populated and not null or empty at runtime.
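A minimal sketch of what steps 1 and 2 look like in code, assuming a simplified PROMPT_TEMPLATES layout and a stand-in for the project's get_prompt_template helper; the template texts and prompt names here are illustrative, not the project's exact defaults:

```python
# Simplified sketch, not the project's actual configs/prompt_config.py:
# illustrative PROMPT_TEMPLATES entries plus a stand-in lookup helper.
PROMPT_TEMPLATES = {
    "llm_chat": {
        "default": "{{ input }}",
        "with_history": "Previous conversation:\n{{ history }}\nHuman: {{ input }}\nAI:",
    },
    "knowledge_base_chat": {
        # {{ context }} / {{ question }} only make sense for knowledge-base
        # and search-engine chats, as noted above.
        "default": (
            "Answer using only the context below.\n"
            "Context:\n{{ context }}\n\nQuestion: {{ question }}"
        ),
    },
}

def get_prompt_template(type: str, name: str) -> str:
    """Stand-in lookup: return the template for a conversation type and prompt
    name, falling back to that type's default template."""
    templates = PROMPT_TEMPLATES[type]
    return templates.get(name, templates["default"])

# Conceptually what the chat endpoints do when selecting a prompt:
print(get_prompt_template("llm_chat", "with_history"))
print(get_prompt_template("knowledge_base_chat", "default"))
```

If your custom template never shows up in the model's answers, a quick check like the prints above at least confirms that the lookup returns the text you expect.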

For more detailed information, you can refer to the example configuration in configs/prompt_config.py.example and the implementation in server/chat/chat.py.

If you've checked these areas and the issue persists, it might be helpful to provide more specific details about the error messages or behaviors you're observing for further assistance.


EthanChenLi commented 6 months ago

It has no effect for me either.

zixiaotan21 commented 6 months ago

I ran into the same problem. I tried the steps below; maybe this is a way to solve it?

  1. After Langchain-Chatchat starts, choose the template you defined under 【请选择Prompt模板】 ("Please select a Prompt template") on the left side of the WebUI.
  2. Edit dialogue.py under webui_pages\dialogue and change line 143 from text = f"已切换为 {prompt_template_name} 模板。" to text = f"已切换为 {st.session_state.prompt_template_select} 模板。" (a sketch of this change follows the list).
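
A minimal sketch (not the actual dialogue.py) of the Streamlit pattern this one-line change targets, assuming the selectbox is registered under the key prompt_template_select; the template names and the toast call are illustrative:

```python
import streamlit as st

PROMPT_TEMPLATE_NAMES = ["default", "with_history"]  # illustrative names

def prompt_change():
    # The fix described above: read the latest selection from session state
    # via the selectbox key, not from a possibly stale variable.
    text = f"已切换为 {st.session_state.prompt_template_select} 模板。"
    st.toast(text)

st.sidebar.selectbox(
    "请选择Prompt模板",
    PROMPT_TEMPLATE_NAMES,
    key="prompt_template_select",
    on_change=prompt_change,
)
```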

For details, see the approach in https://github.com/chatchat-space/Langchain-Chatchat/issues/1827.

After making this change, the desired prompt seems to show up in the answers.

EthanChenLi commented 6 months ago

> I ran into the same problem. I tried the steps below; maybe this is a way to solve it?
>
> 1. After Langchain-Chatchat starts, choose the template you defined under 【请选择Prompt模板】 ("Please select a Prompt template") on the left side of the WebUI.
> 2. Edit dialogue.py under webui_pages\dialogue and change line 143 from text = f"已切换为 {prompt_template_name} 模板。" to text = f"已切换为 {st.session_state.prompt_template_select} 模板。"
>
> For details, see the approach in issue #1827.
>
> After making this change, the desired prompt seems to show up in the answers.

I tried it, and the prompt still doesn't show up. If it does show up on your side, that may just be the model's own short-term memory.

zixiaotan21 commented 6 months ago

> > I ran into the same problem. I tried the steps below; maybe this is a way to solve it?
> >
> > 1. After Langchain-Chatchat starts, choose the template you defined under 【请选择Prompt模板】 ("Please select a Prompt template") on the left side of the WebUI.
> > 2. Edit dialogue.py under webui_pages\dialogue and change line 143 from text = f"已切换为 {prompt_template_name} 模板。" to text = f"已切换为 {st.session_state.prompt_template_select} 模板。"
> >
> > For details, see the approach in issue #1827. After making this change, the desired prompt seems to show up in the answers.
>
> I tried it, and the prompt still doesn't show up. If it does show up on your side, that may just be the model's own short-term memory.

It is indeed a bit strange. When I try it now, I also find that, for example, if the prompt template says "add '谢谢' (thank you) at the end of the answer", some answers do add "谢谢", but after a few more questions it seems to stop adding it.