chatchat-space / Langchain-Chatchat

Langchain-Chatchat (formerly Langchain-ChatGLM): a local-knowledge-based RAG and Agent application built with Langchain and LLMs such as ChatGLM, Qwen, and Llama.
Apache License 2.0

Clicking "Clear cache" in the top-right corner of the WebUI raises an error #3592

Closed zixiaotan21 closed 5 months ago

zixiaotan21 commented 6 months ago

Problem Description: After starting Langchain-Chatchat, clicking "Clear cache" in the top-right corner of the WebUI raises an error.

Steps to Reproduce

  1. Start Langchain-Chatchat
  2. Open the WebUI
  3. Click "Clear cache" in the top-right corner of the WebUI; the error appears

Expected Result: No error is raised.

Actual Result:

AttributeError: st.session_state has no attribute "dialogue_mode". Did you forget to initialize it? More info: https://docs.streamlit.io/library/advanced-features/session-state#initialization

Traceback:
File "C:\Users\butterfly\anaconda3\envs\chatchat_env\Lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 531, in _run_script
    self._session_state.on_script_will_rerun(rerun_data.widget_states)
File "C:\Users\butterfly\anaconda3\envs\chatchat_env\Lib\site-packages\streamlit\runtime\state\safe_session_state.py", line 63, in on_script_will_rerun
    self._state.on_script_will_rerun(latest_widget_states)
File "C:\Users\butterfly\anaconda3\envs\chatchat_env\Lib\site-packages\streamlit\runtime\state\session_state.py", line 504, in on_script_will_rerun
    self._call_callbacks()
File "C:\Users\butterfly\anaconda3\envs\chatchat_env\Lib\site-packages\streamlit\runtime\state\session_state.py", line 517, in _call_callbacks
    self._new_widget_state.call_callback(wid)
File "C:\Users\butterfly\anaconda3\envs\chatchat_env\Lib\site-packages\streamlit\runtime\state\session_state.py", line 261, in call_callback
    callback(*args, **kwargs)
File "C:\Users\butterfly\Langchain-Chatchat\webui_pages\dialogue\dialogue.py", line 130, in on_mode_change
    mode = st.session_state.dialogue_mode
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\butterfly\anaconda3\envs\chatchat_env\Lib\site-packages\streamlit\runtime\state\session_state_proxy.py", line 121, in __getattr__
    raise AttributeError(_missing_attr_error_message(key))

Environment Information

Additional Information: Add any other information related to the issue.

dosubot[bot] commented 6 months ago

Hey there, @zixiaotan21! Good to see you back with us. Hope you're doing well!

To resolve the AttributeError indicating that st.session_state has no attribute "dialogue_mode" when attempting to clear the cache from the top right corner of the WebUI interface in Langchain-Chatchat v0.2.10, you should ensure that the dialogue_mode attribute is initialized in st.session_state before it's accessed or modified. This can be done by setting a default value for dialogue_mode at the beginning of your application or before it's used.

In the provided context, the error can occur if the dialogue_mode attribute is accessed before it's set, especially on a fresh session or after clearing the cache, which resets the session state. To prevent this, you can modify the dialogue_page function or the section where dialogue_mode is first accessed to include a default initialization if it's not already present in st.session_state.
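
The reason the access fails is specific to how st.session_state handles missing keys: attribute access raises AttributeError, dict-style indexing raises KeyError, and only .get (or a prior initialization) avoids the exception. A small illustration, to be run inside a Streamlit app, using the key name from the traceback above:

import streamlit as st

# With "dialogue_mode" not yet initialized in the current session:
st.session_state.get("dialogue_mode")   # returns None, no exception
# st.session_state["dialogue_mode"]     # would raise KeyError
# st.session_state.dialogue_mode        # would raise AttributeError (the error reported here)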

Here's a modification you can apply:

def dialogue_page(api: ApiRequest, is_lite: bool = False):
    # Initialize 'dialogue_mode' in session state with a default value if it doesn't exist
    if 'dialogue_mode' not in st.session_state:
        st.session_state['dialogue_mode'] = "LLM 对话"  # Default mode, adjust as needed

    st.session_state.setdefault("conversation_ids", {})
    st.session_state["conversation_ids"].setdefault(chat_box.cur_chat_name, uuid.uuid4().hex)
    st.session_state.setdefault("file_chat_id", None)
    default_model = api.get_default_llm_model()[0]

    # The rest of your function continues here...

This code snippet checks if dialogue_mode is not in st.session_state and sets it to a default value ("LLM 对话" in this example, but you can adjust it to your application's needs). This ensures that dialogue_mode is always initialized before any operation is performed on it, thus preventing the AttributeError.
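
Equivalently, since the function already uses setdefault for its other session keys, the same guard can be written in that style (a purely stylistic variant; st.session_state supports the standard dict methods):

st.session_state.setdefault("dialogue_mode", "LLM 对话")  # default mode, adjust as needed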

The dialogue_mode attribute in the Langchain-Chatchat application is used to select the mode of dialogue interaction. It allows users to choose between different types of dialogue, including:

  1. LLM 对话 (LLM Dialogue): This mode likely involves interaction with a large language model (LLM) for generating responses.
  2. 知识库问答 (Knowledge Base Q&A): This mode is for querying a knowledge base to find answers.
  3. 文件对话 (File Dialogue): This mode seems to involve uploading documents and using them as a basis for dialogue or answering questions.
  4. 搜索引擎问答 (Search Engine Q&A): This mode uses a search engine to find answers to user queries.
  5. 自定义Agent问答 (Custom Agent Q&A): This mode involves interaction with a custom agent, possibly for specialized or domain-specific inquiries.

The dialogue_mode attribute determines the operational context of the chat, affecting how user inputs are processed and how responses are generated. For instance, in "知识库问答" mode, the application might query a specified knowledge base to find relevant answers, while in "LLM 对话" mode, it might generate responses using a large language model. The choice of mode influences the backend processing of queries and the generation of responses, tailoring the chatbot's behavior to the selected dialogue context.

You can find more details in the dialogue.py file in the Langchain-Chatchat repository.
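
Since the traceback shows the error originating in the on_mode_change callback of that file, a complementary safeguard is to read the key defensively inside the callback itself. The sketch below is illustrative and only assumes the callback has roughly the shape visible in the traceback; the real function in dialogue.py contains additional logic:

import streamlit as st

def on_mode_change():
    # Fall back to a default instead of raising AttributeError when the
    # session state has just been cleared and the key is missing.
    mode = st.session_state.get("dialogue_mode", "LLM 对话")
    st.toast(f"已切换到 {mode} 模式。")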


#### About Dosu

This response is meant to be useful and save you time. It is **not meant to be a precise solution**, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.

github-actions[bot] commented 5 months ago

This issue has been marked as stale because it has had no activity for more than 30 days.