langchain-ai / langgraphjs

⚡ Build language agents as graphs ⚡
https://langchain-ai.github.io/langgraphjs/
MIT License

createReactAgent no longer works with ChatAnthropic #519

Closed · Pckool closed this issue 3 hours ago

Pckool commented 3 hours ago

I'm using createReactAgent, and when I pass an OpenAI model it works fine, but when I pass an Anthropic model, the model never gets the messages passed to the agent graph, even though they are there at the top level of the trace. Once it gets into the RunnableLambda scope, the messages property is lost somehow. The system message is still there, but the other messages passed are not. I'm on the latest version of all packages:

{
  "@langchain/community": "0.3.2",
  "@langchain/core": "0.3.3",
  "@langchain/langgraph": "0.2.8",
  "@langchain/anthropic": "0.3.3",
  "@langchain/openai": "0.3.2",
  "langchain": "0.3.2",
}

LangSmith traces (screenshots) below: CleanShot 2024-09-26 at 15 39 10@2x, CleanShot 2024-09-26 at 15 39 54@2x

Here's an example of how I'm calling it:

import { ChatAnthropic } from '@langchain/anthropic';
import { ChatPromptTemplate, SystemMessagePromptTemplate } from '@langchain/core/prompts';
import { createReactAgent } from '@langchain/langgraph/prebuilt';

export const claude35Sonnet = new ChatAnthropic({
  modelName: 'claude-3-5-sonnet-20240620',
  temperature: 0.45,
  streaming: true,
  anthropicApiKey: env.ANTHROPIC_API_KEY,
});

const prompt = ChatPromptTemplate.fromMessages(
  [
    SystemMessagePromptTemplate.fromTemplate(`some custom system instructions`),
  ]
);

const agent = createReactAgent({ 
  llm: claude35Sonnet, 
  tools: [
    // imagine a list of tools here
  ], 
  messageModifier: prompt 
});

const res = await agent.invoke({
  messages: [
    ['human', 'some random text, please do something for me AI :)']
  ]
})
jacoblee93 commented 3 hours ago

Hey @Pckool, you need to add a MessagesPlaceholder to your prompt where the agent will inject messages:

export const claude35Sonnet = new ChatAnthropic({
  modelName: 'claude-3-5-sonnet-20240620',
  temperature: 0.45,
  streaming: true,
  anthropicApiKey: env.ANTHROPIC_API_KEY,
});

const prompt = ChatPromptTemplate.fromMessages(
  [
    ["system", `some custom system instructions`],
    ["placeholder", "{messages}"],
  ]
);

const agent = createReactAgent({ 
  llm: claude35Sonnet, 
  tools: [
    // imagine a list of tools here
  ], 
  messageModifier: prompt 
});

const res = await agent.invoke({
  messages: [
    { role: 'user', content: 'some random text, please do something for me AI :)'}
  ]
})

Please reopen if the above doesn't fix it!

Pckool commented 3 hours ago

Hey @jacoblee93! That was the first thing I tried, and it throws an error saying `messages` is missing. I also looked at the code, and createReactAgent already adds this placeholder, so it shouldn't be needed (and, as I mentioned, this code works perfectly when using an OpenAI model).

Error screenshot: CleanShot 2024-09-26 at 16 20 26@2x

Here are the traces (screenshots): CleanShot 2024-09-26 at 16 21 14@2x, CleanShot 2024-09-26 at 16 21 54@2x
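
For reference, here's a minimal no-placeholder sketch, assuming messageModifier also accepts a plain string as the system prompt and the prebuilt agent appends the graph's messages on its own:

import { ChatAnthropic } from '@langchain/anthropic';
import { createReactAgent } from '@langchain/langgraph/prebuilt';

// Sketch (assumption): a plain-string messageModifier acts as the system
// prompt, and the prebuilt agent supplies the message history itself.
const model = new ChatAnthropic({ model: 'claude-3-5-sonnet-20240620' });

const agent = createReactAgent({
  llm: model,
  tools: [
    // imagine a list of tools here
  ],
  messageModifier: 'some custom system instructions',
});

const res = await agent.invoke({
  messages: [{ role: 'user', content: 'some random text, please do something for me AI :)' }],
});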

When the placeholder version didn't work, I tried formatting the messages before sending them to the react agent, like so:

const agent = createReactAgent({ 
  llm: claude35Sonnet, 
  tools: [
    // imagine a list of tools here
  ], 
  messageModifier: () => prompt.formatMessages({ messages: cleanedMessages })
});

But that doesn't work either; the messages get lost again in the RunnableLambda scope, just like above.
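
For completeness, here's a sketch of the function form that keeps the incoming messages, under the assumption that messageModifier is called with the graph's current message list:

import { SystemMessage, type BaseMessage } from '@langchain/core/messages';
import { createReactAgent } from '@langchain/langgraph/prebuilt';

// Sketch (assumption): the function form receives the current messages and
// returns the messages to send to the model, so the history is preserved.
// claude35Sonnet is the ChatAnthropic instance defined earlier in this thread.
const agent = createReactAgent({
  llm: claude35Sonnet,
  tools: [
    // imagine a list of tools here
  ],
  messageModifier: (messages: BaseMessage[]) => [
    new SystemMessage('some custom system instructions'),
    ...messages,
  ],
});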

Pckool commented 2 hours ago

I can't reopen this issue from here, so I created another issue with the context above: https://github.com/langchain-ai/langgraphjs/issues/521

Pckool commented 2 hours ago

Found the issue: it was a silly mistake in how I was creating the messages, and Anthropic couldn't recognize the roles I was giving it. I gave the full explanation in #521.
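
The details are in #521; purely as an illustration (the exact root cause isn't restated here), constructing messages with explicit message classes removes any ambiguity about which role strings the provider accepts:

import { HumanMessage } from '@langchain/core/messages';

// Illustrative sketch (see #521 for the actual root cause): explicit message
// classes avoid relying on role strings the Anthropic integration may not map.
const res = await agent.invoke({
  messages: [new HumanMessage('some random text, please do something for me AI :)')],
});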

jacoblee93 commented 1 hour ago

Ah ok great! Glad you figured it out.