langchain-ai / langchain


StreamlitChatMessageHistory gives KeyError: 'st.session_state has no key "langchain_messages"' #18790

Closed · pratikdel04 closed this 3 months ago

pratikdel04 commented 7 months ago

Checked other resources

Example Code

from langchain_community.chat_message_histories import StreamlitChatMessageHistory
import streamlit as st
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_community.chat_models import ChatCohere

# Optionally, specify your own session_state key for storing messages
msgs = StreamlitChatMessageHistory(key="special_app_key")

if len(msgs.messages) == 0:
    msgs.add_ai_message("How can I help you?")

prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are an AI chatbot having a conversation with a human."),
        MessagesPlaceholder(variable_name="history"),
        ("human", "{question}"),
    ]
)

chain = prompt | ChatCohere(cohere_api_key="", model="command", max_tokens=256, temperature=0.75)
chain_with_history = RunnableWithMessageHistory(
    chain,
    lambda session_id: msgs,  # Always return the instance created earlier
    input_messages_key="question",
    history_messages_key="history",
)

for msg in msgs.messages:
    st.chat_message(msg.type).write(msg.content)

if prompt := st.chat_input():
    st.chat_message("human").write(prompt)

    # As usual, new messages are added to StreamlitChatMessageHistory when the Chain is called.
    config = {"configurable": {"session_id": "any"}}
    response = chain_with_history.invoke({"question": prompt}, config)
    st.chat_message("ai").write(response.content)

Error Message and Stack Trace (if applicable)

KeyError: 'st.session_state has no key "langchain_messages". Did you forget to initialize it? More info: https://docs.streamlit.io/library/advanced-features/session-state#initialization'

Traceback:
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 535, in _run_script
    exec(code, module.__dict__)
File "C:\Users\prakotian\Desktop\Projects\ChatData\chat.py", line 142, in <module>
    response = chain_with_history.invoke({"question": prompt}, config)
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\site-packages\langchain_core\runnables\base.py", line 4069, in invoke
    return self.bound.invoke(
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\site-packages\langchain_core\runnables\base.py", line 4069, in invoke
    return self.bound.invoke(
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\site-packages\langchain_core\runnables\base.py", line 2075, in invoke
    input = step.invoke(
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\site-packages\langchain_core\runnables\base.py", line 4069, in invoke
    return self.bound.invoke(
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\site-packages\langchain_core\runnables\passthrough.py", line 419, in invoke
    return self._call_with_config(self._invoke, input, config, **kwargs)
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\site-packages\langchain_core\runnables\base.py", line 1262, in _call_with_config
    context.run(
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\site-packages\langchain_core\runnables\config.py", line 326, in call_func_with_variable_args
    return func(input, **kwargs)  # type: ignore[call-arg]
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\site-packages\langchain_core\runnables\passthrough.py", line 406, in _invoke
    **self.mapper.invoke(
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\site-packages\langchain_core\runnables\base.py", line 2712, in invoke
    output = {key: future.result() for key, future in zip(steps, futures)}
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\site-packages\langchain_core\runnables\base.py", line 2712, in <dictcomp>
    output = {key: future.result() for key, future in zip(steps, futures)}
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\concurrent\futures\_base.py", line 444, in result
    return self.__get_result()
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\concurrent\futures\_base.py", line 389, in __get_result
    raise self._exception
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\concurrent\futures\thread.py", line 57, in run
    result = self.fn(*self.args, **self.kwargs)
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\site-packages\langchain_core\runnables\base.py", line 4069, in invoke
    return self.bound.invoke(
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\site-packages\langchain_core\runnables\base.py", line 3523, in invoke
    return self._call_with_config(
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\site-packages\langchain_core\runnables\base.py", line 1262, in _call_with_config
    context.run(
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\site-packages\langchain_core\runnables\config.py", line 326, in call_func_with_variable_args
    return func(input, **kwargs)  # type: ignore[call-arg]
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\site-packages\langchain_core\runnables\base.py", line 3397, in _invoke
    output = call_func_with_variable_args(
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\site-packages\langchain_core\runnables\config.py", line 326, in call_func_with_variable_args
    return func(input, **kwargs)  # type: ignore[call-arg]
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\site-packages\langchain_core\runnables\history.py", line 409, in _enter_history
    return hist.messages.copy()
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\site-packages\langchain_community\chat_message_histories\streamlit.py", line 32, in messages
    return st.session_state[self._key]
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\site-packages\streamlit\runtime\state\session_state_proxy.py", line 90, in __getitem__
    return get_session_state()[key]
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\site-packages\streamlit\runtime\state\safe_session_state.py", line 91, in __getitem__
    return self._state[key]
File "c:\users\prakotian\appdata\local\programs\python\python38\lib\site-packages\streamlit\runtime\state\session_state.py", line 400, in __getitem__
    raise KeyError(_missing_key_error_message(key))

Description

I am following the example step by step from https://python.langchain.com/docs/integrations/memory/streamlit_chat_message_history, but I am still getting the KeyError shown above.
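
Worth noting (my observation, not part of the original report): the traceback names the default key "langchain_messages" even though the snippet passes key="special_app_key", so it is worth checking that every StreamlitChatMessageHistory in the app is constructed with the same explicit key. Below is a minimal workaround sketch, assuming the missing session_state entry just needs to exist before the chain reads the history; the HISTORY_KEY constant is illustrative, not from the docs:

import streamlit as st
from langchain_community.chat_message_histories import StreamlitChatMessageHistory

# Workaround sketch (assumption, not a confirmed fix): pre-create the
# session_state entry that StreamlitChatMessageHistory.messages reads,
# so the lookup cannot raise KeyError on the first script run.
HISTORY_KEY = "langchain_messages"  # the key named in the error; match whatever key you pass in

if HISTORY_KEY not in st.session_state:
    st.session_state[HISTORY_KEY] = []

msgs = StreamlitChatMessageHistory(key=HISTORY_KEY)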

System Info

System Information

OS: Windows
OS Version: 10.0.19045
Python Version: 3.8.10 (tags/v3.8.10:3d8993a, May 3 2021, 11:48:03) [MSC v.1928 64 bit (AMD64)]

Package Information

langchain_core: 0.1.29
langchain: 0.1.11
langchain_community: 0.0.25
langsmith: 0.1.19
langchain_text_splitters: 0.0.1

Packages not installed (Not Necessarily a Problem)

The following packages were not found:

langgraph
langserve

AtomAsking commented 7 months ago

#18684

JohanRain commented 6 months ago

The same problem occurred for me on langchain 0.1.11, but after updating to 0.1.13 the problem was resolved.
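
For context, the kind of change that makes this error go away is lazy initialization: create the session_state entry the first time messages are read instead of assuming it already exists. A rough illustrative sketch of that pattern follows; it is my paraphrase of the defensive idea, not the actual langchain_community 0.1.13 source:

from typing import List

import streamlit as st
from langchain_core.messages import BaseMessage


class LazySessionHistory:
    """Illustrative sketch only: read the message list defensively so a
    missing session_state key is created rather than raising KeyError."""

    def __init__(self, key: str = "langchain_messages") -> None:
        self._key = key

    @property
    def messages(self) -> List[BaseMessage]:
        # setdefault inserts an empty list if the key is missing, then returns it.
        return st.session_state.setdefault(self._key, [])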