Closed · XienXX closed this issue 5 months ago
Hey there @XienXX! Great to see you back with us, diving into the digital depths of Langchain-Chatchat 🐙
It looks like the issue you're encountering with the prompt configuration not taking effect in Langchain-Chatchat v0.2.9 might be due to an incorrect setup in the `PROMPT_TEMPLATES` dictionary within the configuration file. Ensure your template string adheres to Jinja2 syntax and uses the correct variables for the conversation type you're working with. For instance, variables like `{{ context }}` or `{{ question }}` should only be used in conversation types where they are applicable, such as "knowledge_base_chat" and "search_engine_chat".
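For orientation, here is a minimal sketch of what an entry in `PROMPT_TEMPLATES` might look like. The key names and the `{{ input }}` variable for "llm_chat" are assumptions based on the v0.2.x example file, so check your own `configs/prompt_config.py.example` for the authoritative variable names:

```python
# configs/prompt_config.py -- illustrative excerpt only; the exact keys and
# default templates in your version may differ.
PROMPT_TEMPLATES = {
    "llm_chat": {
        # Plain chat: only the user's input is substituted in
        # ({{ input }} is the variable name assumed here).
        "default": "{{ input }}",
        "polite": "You are a helpful assistant. End every answer with 'Thank you.'\n{{ input }}",
    },
    "knowledge_base_chat": {
        # RAG-style chat: the server fills in {{ context }} and {{ question }}.
        "default": (
            "Answer the question using only the context below.\n"
            "Context:\n{{ context }}\n"
            "Question: {{ question }}"
        ),
    },
}
```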
Here's a brief guide to help you set up your system prompt template correctly:
- **Modify the `PROMPT_TEMPLATES` dictionary**: it is located in `configs/prompt_config.py.example`. Each key-value pair in this dictionary represents a different conversation type and its corresponding template. Make sure your template string is correctly formatted and uses the appropriate variables.
- **Select the appropriate prompt template**: when calling the API interface, the system selects the prompt template based on the conversation type. Ensure you're using the correct call to fetch your template, such as `get_prompt_template("llm_chat", prompt_name)` for direct input or `get_prompt_template("llm_chat", "with_history")` for historical messages (see the sketch after this list).
- **Check for null or empty values**: the "None" comment in the bug tracking system screenshot suggests there might be a null reference or an empty-value problem. Ensure that the variables used in your prompt templates are correctly populated and not null or empty at runtime.
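As an illustration of the lookup step above, a minimal sketch of what `get_prompt_template` amounts to, assuming the v0.2.x layout (where the function lives in `server/utils.py`; treat the exact path as an assumption):

```python
import importlib
from typing import Optional

def get_prompt_template(type: str, name: str) -> Optional[str]:
    """Fetch a template string by conversation type and template name.

    Sketch only: mirrors the lookup described above, not a verbatim copy.
    """
    from configs import prompt_config
    importlib.reload(prompt_config)  # re-read the file so edits apply without a restart
    return prompt_config.PROMPT_TEMPLATES[type].get(name)

# Example: fetch the history-aware template for plain LLM chat, and fail
# loudly if the key is missing or empty (see the null-check point above).
template = get_prompt_template("llm_chat", "with_history")
assert template, "prompt template is missing or empty"
```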
For more detailed information, you can refer to the example configuration in `configs/prompt_config.py.example` and the implementation in `server/chat/chat.py`.
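If you want to confirm which templates the running server actually sees, a quick check from a Python shell in the project root may help (this assumes the v0.2.x `configs` package layout, where the server imports `configs/prompt_config.py`, i.e. a copy of the `.example` file, rather than the `.example` file itself):

```python
# List every conversation-type / template-name pair the running config defines.
from configs import prompt_config

for conv_type, templates in prompt_config.PROMPT_TEMPLATES.items():
    for name, tmpl in templates.items():
        print(f"{conv_type}/{name}: {tmpl[:60]!r}")
```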
If you've checked these areas and the issue persists, it might be helpful to provide more specific details about the error messages or behaviors you're observing for further assistance.
It didn't work for me either.
I ran into the same problem. I tried the steps below, which may solve it:

- After Langchain-Chatchat starts, in the WebUI left sidebar under 【请选择Prompt模板】 ("Please select a Prompt template"), select the template you defined.
- In `webui_pages\dialogue\dialogue.py`, change line 143 from `text = f"已切换为 {prompt_template_name} 模板。"` to `text = f"已切换为 {st.session_state.prompt_template_select} 模板。"` (shown as code below).

For details, see the approach in issue #1827 (https://github.com/chatchat-space/Langchain-Chatchat/issues/1827). After I made this change, the desired prompt seemed to appear in the answers.
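In code, the change from the second step above (line number per v0.2.x; it may drift between releases):

```python
# webui_pages\dialogue\dialogue.py, around line 143
# Before:
text = f"已切换为 {prompt_template_name} 模板。"
# After:
text = f"已切换为 {st.session_state.prompt_template_select} 模板。"
```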
I tried that, but the prompt still doesn't show up. If it shows up on your side, that may just be the model's own short-term memory.
It is indeed a bit strange. When I try it now I see the same thing: for example, with a prompt template that says "end every answer with '谢谢' ('Thank you')", some answers do add the "谢谢", but after a few more questions it seems to stop adding it.
问题描述 / Problem Description
Changes to the prompt config have no effect. Tested so far with GLM3-6B and Qwen1.5 13B.

环境信息 / Environment Information

Asking for help: is this a problem with the prompt itself, or are the prompt changes not reaching the application layer? How should I proceed?