Closed — akshayghatiki311 closed this issue 11 months ago.
You can use RedisChatMessageHistory(session_id=user_id, url=redis_url) to create and store a session for each user.
Yea, I've been using RedisChatMessageHistory. It works fine.
Hi @gsparsh220598, I want to maintain multiple sessions in my Redis store, with messages linked to their corresponding session IDs. Any idea how to implement this?
Same issue here — I can't store a session for each user. Here is my code:
self.llm = ChatOpenAI(model=self.preferred_model,
                      max_tokens=256,
                      temperature=0)
self.message_history = RedisChatMessageHistory(
    url=f"redis://{self.redis_server}",
    ttl=6000,
    session_id='session_id'
)
self.agent = create_openai_tools_agent(self.llm, self.tools, self.prompt)
# Create an agent executor by passing in the agent and tools
self.agent_executor = AgentExecutor(agent=self.agent, tools=self.tools, verbose=True)
self.agent_with_chat_history = RunnableWithMessageHistory(
    self.agent_executor,
    # A session_id-aware factory is expected here; this lambda ignores
    # session_id and always returns the single shared Redis history
    lambda session_id: self.message_history,
    input_messages_key="input",
    history_messages_key="chat_history",
)
results = await self.agent_with_chat_history.ainvoke(
    {"input": message},
    config={"configurable": {
        "session_id": f"{self.session_id}_{phone_number}",
        'phone_number': phone_number,
        'customer_name': customer_name
    }},
)
Never mind, I made a silly mistake. The fix is to pass the session_id value like this:
lambda session_id: RedisChatMessageHistoryFixed(session_id, url=REDIS_URL),
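For anyone hitting the same thing: RunnableWithMessageHistory calls the factory with the session_id taken from config["configurable"], so the factory has to build (or look up) a history per session instead of closing over one shared object. A minimal sketch of that pattern, assuming langchain_community's RedisChatMessageHistory and a placeholder Redis URL:

```python
def make_session_id(base_session_id: str, phone_number: str) -> str:
    # Compose the per-user key the same way the snippet above does.
    return f"{base_session_id}_{phone_number}"

def get_session_history(session_id: str):
    # Build a Redis-backed history bound to THIS session_id, rather than
    # returning one shared history object. Imported lazily so the helper
    # above is usable even where langchain isn't installed.
    # "redis://localhost:6379/0" is a placeholder — use your own URL.
    from langchain_community.chat_message_histories import RedisChatMessageHistory
    return RedisChatMessageHistory(session_id=session_id, url="redis://localhost:6379/0")

# Wiring sketch: pass the factory itself, not a lambda over a shared history.
# agent_with_chat_history = RunnableWithMessageHistory(
#     agent_executor,
#     get_session_history,
#     input_messages_key="input",
#     history_messages_key="chat_history",
# )
```

With this wiring, every distinct session_id in the invoke config gets its own Redis key, so users' histories never overlap.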
Issue you'd like to raise.
embeddings = OpenAIEmbeddings(model='text-embedding-ada-002',deployment='XXXXX',chunk_size=1)
db = Chroma.from_documents(texts, embeddings)
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True,chat_memory=ChatMessageHistory(messages=[]))
model = ConversationalRetrievalChain.from_llm(
    llm=AzureOpenAI(model_name='gpt-35-turbo',
                    deployment_name='XXXXX',
                    temperature=0.7,
                    openai_api_base=os.environ['OPENAI_API_BASE'],
                    openai_api_key=os.environ['OPENAI_API_KEY']),
    retriever=db.as_retriever(),
    memory=memory)
I'm using ConversationalRetrievalChain to query the embeddings, with a memory object to store chat history. I've written a Flask API to fetch results from the model. I'm trying to store each user's memory in a Redis store so that chat histories don't get mixed up between users. I'm attempting it with the code below, but I'm unable to store the memory in Redis.
r = redis.Redis(host='redis-1XXX.c3XXX.ap-south-1-1.ec2.cloud.redislabs.com', port=17506, db=0, password='XXXX')
memory = {}
memory_key = f"memory_{employee_code}"
messages = r.get(memory_key)
messages = json.loads(messages)
memory[memory_key] = ConversationBufferMemory(memory_key="chat_history", return_messages=True, chat_memory=ChatMessageHistory(messages=messages))
model = ConversationalRetrievalChain.from_llm(
    llm=AzureOpenAI(model_name='gpt-35-turbo',
                    deployment_name='XXXXXX',
                    temperature=0.7,
                    openai_api_base=os.environ['OPENAI_API_BASE'],
                    openai_api_key=os.environ['OPENAI_API_KEY']),
    retriever=db.as_retriever(),
    memory=memory[memory_key])
res=model.run(query)
messages.append({query:res})
r.set(memory_key,json.dumps(messages))
This is not working. Can anyone help me with this?
Thanks in advance.
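Two things in the snippet above are likely to break: r.get() returns None on the first request for a user, so json.loads(messages) raises; and ChatMessageHistory(messages=...) expects message objects, not the plain dicts that json.loads returns. Below is a sketch of the per-user JSON round-trip with a dict-backed stand-in for the Redis client (so it runs without a server) — the real redis.Redis client can be swapped in unchanged:

```python
import json

class FakeRedis:
    """Stand-in for redis.Redis so the sketch runs without a server."""
    def __init__(self):
        self._store = {}
    def get(self, key):
        return self._store.get(key)  # None when the key is missing, like redis
    def set(self, key, value):
        self._store[key] = value

def load_messages(r, memory_key):
    """Return the stored turn list, or [] when nothing is stored yet.
    The original snippet calls json.loads(None) on a user's first request,
    which raises a TypeError."""
    raw = r.get(memory_key)  # a real redis client returns bytes; json.loads accepts bytes
    return json.loads(raw) if raw is not None else []

def save_turn(r, memory_key, query, answer):
    """Append one {"query": ..., "answer": ...} turn and write the list back as JSON."""
    messages = load_messages(r, memory_key)
    messages.append({"query": query, "answer": answer})
    r.set(memory_key, json.dumps(messages))
    return messages
```

To rebuild the ConversationBufferMemory from what Redis returns, the stored dicts also have to be converted back into message objects before being handed to ChatMessageHistory — langchain ships messages_to_dict / messages_from_dict helpers for exactly that conversion, which the original code skips.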