chatchat-space / Langchain-Chatchat

Langchain-Chatchat(原Langchain-ChatGLM)基于 Langchain 与 ChatGLM, Qwen 与 Llama 等语言模型的 RAG 与 Agent 应用 | Langchain-Chatchat (formerly langchain-ChatGLM), local knowledge based LLM (like ChatGLM, Qwen and Llama) RAG and Agent app with langchain
Apache License 2.0

Can the conversation history of the Chatchat API server be deleted or modified to specific content? #3621

Closed zixiaotan21 closed 4 months ago

zixiaotan21 commented 5 months ago

Problem Description: Can the conversation history of the Chatchat API server be deleted or modified to specific content?

Steps to Reproduce

  1. Open the Chatchat API Server: http://127.0.0.1:7861
  2. In the request body:

     {
         "query": "RCEP证书可以自助打印吗",
         "knowledge_base_name": "text_3",
         "top_k": 3,
         "score_threshold": 1,
         "history": [
             {"role": "user", "content": "我们来玩成语接龙,我先来,生龙活虎"},
             {"role": "assistant", "content": "虎头虎脑"}
         ],
         "stream": false,
         "model_name": "qwen-api",
         "temperature": 0.7,
         "max_tokens": 0,
         "prompt_name": "default"
     }
  3. Can the history part here be removed, or modified to custom content?
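
For reference, a minimal client-side sketch of the request above, sent with Python requests. The /chat/knowledge_base_chat path is an assumption inferred from the parameters shown and should be checked against the local API docs at http://127.0.0.1:7861/docs:

import requests

# Sketch only: the endpoint path below is assumed, not confirmed in this thread.
payload = {
    "query": "RCEP证书可以自助打印吗",
    "knowledge_base_name": "text_3",
    "top_k": 3,
    "score_threshold": 1,
    # "history" can be edited freely here, or set to [] to drop it entirely
    "history": [
        {"role": "user", "content": "我们来玩成语接龙,我先来,生龙活虎"},
        {"role": "assistant", "content": "虎头虎脑"},
    ],
    "stream": False,
    "model_name": "qwen-api",
    "temperature": 0.7,
    "max_tokens": 0,
    "prompt_name": "default",
}

resp = requests.post("http://127.0.0.1:7861/chat/knowledge_base_chat", json=payload)
print(resp.text)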
dosubot[bot] commented 5 months ago

Hey, @zixiaotan21! Great to see you again. I hope you're doing well!

Yes, there is a way to delete or alter the "history" part of the conversation when working with the Chat API server. The history parameter in your request body can be directly manipulated before it is used to construct the chat prompt.

The history parameter is expected to be a list of History objects or an integer. If it's a list, each History object in the list can be modified or removed according to your requirements before the chat prompt is constructed. This manipulation can be done right after the history parameter is checked and before it is used to create the chat_prompt.

Here's a simplified example based on your provided code, demonstrating how you could alter the history:

# Import paths assume the Langchain-Chatchat 0.2.x layout (see server/chat/chat.py)
from langchain.prompts.chat import ChatPromptTemplate
from server.chat.utils import History
from server.utils import get_prompt_template

if history:  # If history is provided
    # Convert each item in history to a History object
    history = [History.from_data(h) for h in history]

    # Example: Remove the last item from the history list
    history.pop()

    # Or alter an item in the history
    # Example: Change the content of the first history item
    if history:
        history[0].content = "Altered content"

    # Proceed to use the altered history to construct the chat prompt
    prompt_template = get_prompt_template("llm_chat", prompt_name)
    input_msg = History(role="user", content=prompt_template).to_msg_template(False)
    chat_prompt = ChatPromptTemplate.from_messages(
        [i.to_msg_template() for i in history] + [input_msg])

This example demonstrates how to delete the last item from the history list and how to alter the content of the first history item. Similar manipulations can be applied based on your specific requirements.

You can find more details about the History object and its usage in the Langchain-Chatchat codebase in the server/chat/utils.py and server/chat/chat.py files.

Please note that altering the history might affect the responses generated by the model, as the model uses the history to understand the context of the conversation. Therefore, any changes to the history should be made with consideration of how they might impact the model's performance.

Lu0Key commented 5 months ago

You can set it to an empty array, or edit the contents yourself; each item just needs to follow the {"role": xxx, "content": xxx} format.
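
For example, a minimal sketch of the two options (the payload dict below is a hypothetical stand-in for the request body in this issue):

# Hypothetical payload standing in for the request body shown above
payload = {"query": "RCEP证书可以自助打印吗", "knowledge_base_name": "text_3"}

# Option 1: send no history at all
payload["history"] = []

# Option 2: supply a custom history; each item must follow the {"role": ..., "content": ...} format
payload["history"] = [
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi, how can I help you?"},
]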

zixiaotan21 commented 5 months ago

> You can set it to an empty array, or edit the contents yourself; each item just needs to follow the {"role": xxx, "content": xxx} format.

Got it, I understand now. Thank you!

github-actions[bot] commented 4 months ago

This issue has been marked as stale because it has had no activity for more than 30 days.