langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License

DOC: streamlit memory parameters - memory_key and chat_history #13995

Closed. annachrome closed this issue 5 months ago.

annachrome commented 9 months ago

Issue with current documentation:

Streamlit tutorial suggests

from langchain.memory import ConversationBufferMemory
from langchain.memory.chat_message_histories import StreamlitChatMessageHistory

# Optionally, specify your own session_state key for storing messages
msgs = StreamlitChatMessageHistory(key="special_app_key")

memory = ConversationBufferMemory(memory_key="history", chat_memory=msgs)
if len(msgs.messages) == 0:
    msgs.add_ai_message("How can I help you?")

but in version 0.340, the definition of ConversationBufferMemory includes the parameter memory_key but not chat_history.

I have noticed my model outputs being less accurate for the same script when Streamlit is incorporated, and I suspect this is the issue.

Idea or request for content:

Please clarify how to best implement Streamlit with chat history.

dosubot[bot] commented 9 months ago

🤖

Hi @annachrome! Nice to see you again. I hope you're doing well.

Based on the information you provided and the context from the LangChain repository, it seems there might be some confusion about the ConversationBufferMemory class parameters. In the LangChain framework, you can use the StreamlitChatMessageHistory class to store and use chat message history in a Streamlit app. This class stores messages in the Streamlit session state at the specified key. The default key is "langchain_messages".

You can integrate StreamlitChatMessageHistory into ConversationBufferMemory and chains or agents as usual. The history will be persisted across re-runs of the Streamlit app within a given user session. A given StreamlitChatMessageHistory will NOT be persisted or shared across user sessions.
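As a rough mental model of that persistence behavior (a simplified sketch, not the actual Streamlit or LangChain API), a plain dict can stand in for st.session_state:

```python
# Simplified stand-in: a plain dict models Streamlit's st.session_state.
# The real StreamlitChatMessageHistory stores messages there under its key.
session_state = {}

def get_history(key="langchain_messages"):
    # "langchain_messages" mirrors the default key mentioned above.
    # setdefault means the same list is returned on every call, so messages
    # survive repeated "re-runs" within one user session.
    return session_state.setdefault(key, [])

msgs = get_history("special_app_key")
if len(msgs) == 0:
    msgs.append(("ai", "How can I help you?"))

# A later "re-run" within the same session sees the same message list:
again = get_history("special_app_key")
```

Because each user session gets its own session_state, the history is per-session and is not shared across users, matching the behavior described above.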

Here is an example of how to use it:

from langchain.memory import ConversationBufferMemory
from langchain.memory.chat_message_histories import StreamlitChatMessageHistory

# Optionally, specify your own session_state key for storing messages
msgs = StreamlitChatMessageHistory(key="special_app_key")

memory = ConversationBufferMemory(memory_key="history", chat_memory=msgs)
if len(msgs.messages) == 0:
    msgs.add_ai_message("How can I help you?")

In the above code, StreamlitChatMessageHistory is used to store the chat history. The ConversationBufferMemory is then created with the memory_key parameter set to "history" and the chat_memory parameter set to the StreamlitChatMessageHistory instance.

The StreamlitChatMessageHistory instance is then checked to see if it contains any messages. If it does not, an AI message is added to it.

This setup allows the chat history to be stored and used in a Streamlit app, and it can be integrated into the LangChain framework as usual. The chat history will be persisted across re-runs of the Streamlit app within a given user session.
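The role of memory_key is to name the prompt variable under which the buffered history is exposed to the chain. A minimal pure-Python sketch of that contract (hypothetical stand-in classes, not the real LangChain implementation):

```python
# Hypothetical, simplified stand-in for ConversationBufferMemory's contract:
# load_memory_variables returns the transcript under the configured memory_key,
# so the chain's prompt template must use a matching {history} placeholder.
class BufferMemorySketch:
    def __init__(self, memory_key="history"):
        self.memory_key = memory_key
        self.messages = []

    def add_message(self, role, text):
        self.messages.append((role, text))

    def load_memory_variables(self):
        transcript = "\n".join(f"{role}: {text}" for role, text in self.messages)
        return {self.memory_key: transcript}

memory = BufferMemorySketch(memory_key="history")
memory.add_message("AI", "How can I help you?")
memory.add_message("Human", "What is LangChain?")

# The {history} placeholder matches memory_key="history", so the buffered
# conversation is injected into the prompt.
template = "{history}\nHuman: {input}\nAI:"
prompt = template.format(input="Tell me more.", **memory.load_memory_variables())
```

The key point is that memory_key and the prompt's placeholder name must agree; if they don't, the stored history never reaches the prompt.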

You can find more details in the StreamlitChatMessageHistory notebook in the LangChain repository.

Regarding the changes made to the ConversationBufferMemory class in the transition from version v0.0.340 to the latest version v0.0.342, I wasn't able to find specific information within the repository. It's possible that the changes relate to other aspects of the LangChain framework, or that they are not directly related to the issue you're experiencing.

If you continue to experience issues with your model outputs being less accurate when incorporating Streamlit, it might be helpful to provide more details about the specific inaccuracies you're noticing. This could help in identifying whether the issue is related to the ConversationBufferMemory class or another aspect of the LangChain framework.


sfc-gh-jcarroll commented 8 months ago

Thanks @annachrome

You can fix it by removing the memory_key arg from ConversationBufferMemory

I submitted a PR https://github.com/langchain-ai/langchain/pull/16072

There's also a working example code here: https://github.com/langchain-ai/streamlit-agent/blob/main/streamlit_agent/basic_memory.py (also linked in the doc already)
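One illustrative guess at why the explicit memory_key arg can break things (not a confirmed diagnosis of this issue): if the key doesn't match the variable name the chain's prompt expects, filling the template fails, and the history never reaches the model. A tiny sketch using plain str.format in place of a real prompt template:

```python
# Hypothetical illustration: a prompt expecting {chat_history} cannot be
# filled from memory variables keyed "history".
memory_variables = {"history": "AI: How can I help you?"}
template = "{chat_history}\nHuman: {input}\nAI:"

try:
    template.format(input="Hi", **memory_variables)
    matched = True
except KeyError:
    # The mismatched key surfaces as a missing-variable error here;
    # depending on the chain, it may instead be silently dropped.
    matched = False
```

Leaving memory_key at its default lets the library's own defaults stay in agreement, which is why dropping the argument resolves the problem.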