hwchase17 / langchain-streamlit-template


Memory is not getting stored in conversation flows in streamlit app #2

Open · anushkayadav opened this issue 1 year ago

anushkayadav commented 1 year ago
[Screenshot: 2023-03-15 at 9:30:06 PM]

Verbose output

[Screenshot: 2023-03-15 at 9:30:49 PM]

The history is not getting saved. I defined the chain as follows:

from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

chain = ConversationChain(llm=llm,
                          verbose=True,
                          memory=ConversationBufferMemory())

I also tried using a prompt template with LLMChain:

from langchain.chains import LLMChain
from langchain.memory import ConversationBufferWindowMemory
from langchain.prompts import PromptTemplate

# template = prompt definition here
prompt = PromptTemplate(
    input_variables=["history", "human_input"],
    template=template
)

chatgpt_chain = LLMChain(
    llm=llm,
    prompt=prompt,
    verbose=True,
    memory=ConversationBufferWindowMemory(k=4),
)

output = chatgpt_chain.predict(human_input="Hello")

All of this works fine in notebooks, but in the Streamlit app the history is not getting passed into the prompt.
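
A likely explanation is that Streamlit reruns the entire script on every interaction, so a chain constructed at the top level of the script, roughly as in the sketch below (the OpenAI llm here is illustrative), is rebuilt with a fresh ConversationBufferMemory on each rerun, which would account for the lost history:

import streamlit as st
from langchain.chains import ConversationChain
from langchain.llms import OpenAI
from langchain.memory import ConversationBufferMemory

# This code runs again on every interaction, so the memory starts empty
# each time and previous turns never make it into the prompt.
llm = OpenAI(temperature=0)
chain = ConversationChain(llm=llm,
                          verbose=True,
                          memory=ConversationBufferMemory())

user_input = st.text_input("You:")
if user_input:
    st.write(chain.predict(input=user_input))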

samdobson commented 1 year ago

Hey. I made a PR with a suggested approach to ensuring memory persists between messages: take a look at #1
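
A minimal sketch of one way to do this, assuming the standard streamlit and langchain APIs (it may not match what #1 actually implements), is to create the chain once and keep it in st.session_state so the same memory object survives reruns:

import streamlit as st
from langchain.chains import ConversationChain
from langchain.llms import OpenAI
from langchain.memory import ConversationBufferMemory

# Build the chain only on the first run and stash it in session state,
# so the same ConversationBufferMemory is reused across reruns.
if "chain" not in st.session_state:
    st.session_state.chain = ConversationChain(
        llm=OpenAI(temperature=0),
        memory=ConversationBufferMemory(),
        verbose=True,
    )

user_input = st.text_input("You:")
if user_input:
    st.write(st.session_state.chain.predict(input=user_input))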

anushkayadav commented 1 year ago

@samdobson This works! Thanks