selvaradov opened 2 months ago
Okay, I am trying to replicate this issue and will see.
@selvaradov can you send the entire traceback please?
@keenborder786 Thanks! Unfortunately there is no further stack trace that shows up. Is there a way I can find it other than in the terminal / notebook output?
Let me know if you can't replicate it, but the code above was sufficient to cause the issue in a fresh Colab notebook for me. (I think the `await` syntax may be different if not running in Jupyter?)
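For anyone running this outside a notebook, here is a minimal sketch of the difference (the `run_agent` coroutine is a placeholder standing in for the actual agent call):

```python
import asyncio

async def run_agent():
    # Placeholder coroutine standing in for the agent call,
    # e.g. iterating over agent_executor.astream_events(...).
    ...

# In a Jupyter/Colab cell an event loop is already running, so you can simply:
#     await run_agent()

# In a plain Python script there is no running loop, so you start one yourself:
if __name__ == "__main__":
    asyncio.run(run_agent())
```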
I was not able to replicate it, but can you run the following and see if it works for you? (This basically follows the old style of using memory, but it might work for your use case.)
```python
from langchain_openai import OpenAIEmbeddings
from langchain_anthropic import ChatAnthropic
from langchain.agents import create_tool_calling_agent, AgentExecutor
from langchain.tools.retriever import create_retriever_tool
from langchain_core.prompts import (
    ChatPromptTemplate,
    MessagesPlaceholder,
)
from langchain.chains.query_constructor.schema import AttributeInfo
from langchain_community.chat_message_histories import SQLChatMessageHistory
from langchain.retrievers import SelfQueryRetriever
from langchain.schema import Document
import asyncio
from langchain.vectorstores import Chroma
from langchain.memory import ConversationBufferMemory

# Initialize LLM
llm = ChatAnthropic(
    model="claude-3-5-sonnet-20240620",
    max_tokens_to_sample=8192,
)

example_doc = Document("In 2014 something very important happened")
embeddings = OpenAIEmbeddings()
vectorstore = Chroma.from_documents(documents=[example_doc], embedding=embeddings)


def create_self_query_retriever(vectorstore):
    metadata_field_info = [
        AttributeInfo(
            name="date",
            description="The year associated with the information.",
            type="string",
        ),
    ]
    document_content_description = "Landmark developments in AI."
    return SelfQueryRetriever.from_llm(
        llm,
        vectorstore,
        document_content_description,
        metadata_field_info,
    )


self_query_retriever = create_self_query_retriever(vectorstore)

prompt = ChatPromptTemplate.from_messages(
    [
        MessagesPlaceholder(variable_name="history"),
        ("human", "{input}"),
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ]
)

tool = create_retriever_tool(
    self_query_retriever,
    "ai_retriever",
    "Searches for information about developments in AI.",
)
tools = [tool]

# Create the agent
memory_history = SQLChatMessageHistory(
    session_id="",
    connection="sqlite:///chats.db",
    async_mode=False,
)
memory = ConversationBufferMemory(
    chat_memory=memory_history,
    input_key="input",
    memory_key="history",
    return_messages=True,
)
agent = create_tool_calling_agent(llm, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True, memory=memory)


async def run_agent_with_updates(agent, query, sid):
    config = {"configurable": {"session_id": sid}}
    async for event in agent_executor.astream_events(
        {"input": query},
        config,
        version="v2",
    ):
        kind = event["event"]
        if kind == "on_chat_model_stream":
            content = event["data"]["chunk"].content
            if content:
                print(content, end="", flush=True)


async def main(session_id: str):
    agent_executor.memory.chat_memory.session_id = session_id
    await run_agent_with_updates(agent_executor, "What was the main development in AI in 2014?", "123")


asyncio.run(main('foo'))
```
Here is a notebook to replicate the issue precisely: https://colab.research.google.com/drive/1OIziMD6Bk9YEgVWNFycV7aGWEoWoTU8O?usp=sharing. If you scroll all the way to the right of the two output cells you'll see `WARNING:langchain_core.callbacks.manager:Error in AsyncRootListenersTracer.on_chain_end callback: AttributeError("'dict' object has no attribute 'type'")`, and from the second reply it's also clear that the agent isn't able to recall history.
[Edit: I've added some more context to the notebook to show that this is a problem that only occurs with the Anthropic model, not OpenAI ones.]
I'll try your solution and see if it works for my use case. One difference I can see now is that it's not using the async mode for the SQL chat history, which possibly matters.
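For reference, a minimal sketch of what the async-mode variant of the history would look like (assuming the `aiosqlite` driver is installed; the session id is a placeholder):

```python
from langchain_community.chat_message_histories import SQLChatMessageHistory

# Hypothetical async-mode variant: the history is backed by an async SQLAlchemy
# engine, so the connection string uses the aiosqlite driver and async_mode=True.
async_history = SQLChatMessageHistory(
    session_id="123",
    connection="sqlite+aiosqlite:///chats.db",
    async_mode=True,
)
```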
Yep, the solution you came up with is working. The main issue with the `AsyncRootListenersTracer` warning isn't fixed, but this is a helpful workaround, so thanks!
Two questions about the new code: for `memory_history`, why is it that `session_id` gets set to `''`, and then gets modified manually through `agent_executor.memory.chat_memory.session_id = session_id`?

@selvaradov:
- With `RunnableWithMessageHistory` it is done automatically behind the scenes (the same thing is being done here).
- Async modes of `SQLChatMessageHistory` are not supported when `AgentExecutor` is executed directly.
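For comparison, here is a minimal sketch (the question text and message keys are illustrative) of how `RunnableWithMessageHistory` supplies the session id from the config instead of it being set manually on the history object:

```python
from langchain_community.chat_message_histories import SQLChatMessageHistory
from langchain_core.runnables.history import RunnableWithMessageHistory

# Factory that builds a history object for whichever session_id is passed
# via the config at invocation time.
def get_history(session_id: str) -> SQLChatMessageHistory:
    return SQLChatMessageHistory(
        session_id=session_id,
        connection="sqlite:///chats.db",
    )

agent_with_history = RunnableWithMessageHistory(
    agent_executor,  # the AgentExecutor from above, built without the memory= argument
    get_history,
    input_messages_key="input",
    history_messages_key="history",
)

# The session_id is supplied per call; the wrapper sets it on the history for you.
agent_with_history.invoke(
    {"input": "What was the main development in AI in 2014?"},
    config={"configurable": {"session_id": "123"}},
)
```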
### Checked other resources

### Example Code

Try the following code:

And then ask a question relying on context:

### Error Message and Stack Trace (if applicable)

### Description

`chats.db` SQLite database

### System Info