langchain-ai / langchainjs


BedrockChat throws error on tool use with "anthropic.claude-3-5-sonnet-20240620-v1:0" and "anthropic.claude-3-haiku-20240307-v1:0" #7237

Open · goldeneagle3636 opened this issue 1 day ago

goldeneagle3636 commented 1 day ago

Checked other resources

Example Code

import { BedrockChat } from "@langchain/community/chat_models/bedrock"
import { tool } from "@langchain/core/tools"
import { AgentExecutor, createToolCallingAgent } from 'langchain/agents'
import { ChatPromptTemplate } from '@langchain/core/prompts'

// anthropic.claude-3-haiku-20240307-v1:0
const model = new BedrockChat({
  model: "anthropic.claude-3-5-sonnet-20240620-v1:0",
  region: "us-east-1",
  credentials: {
    accessKeyId: <>,
    secretAccessKey: <>,
  }
})

const imageTool = tool(
  async () => {
    return "image url";
  },
  {
    name: "Get-Image-Tool",
    description:
      "Use this tool if the user asks you to send them an image/picture",
  }
);

const systemMessage = "you are a helpful assistant"

const prompt = ChatPromptTemplate.fromMessages([
  ['system', systemMessage],
  ['placeholder', '{chat_history}'],
  ['human', '{input}'],
  ['placeholder', '{agent_scratchpad}'],
])

const agent = createToolCallingAgent({
  llm: model,
  tools: [imageTool],
  prompt,
  streamRunnable: true,
})
const agentExecutor = new AgentExecutor({
  agent,
  tools: [imageTool],
  verbose: false,
})

console.log("Agent created")

const result = await agentExecutor.invoke({
  input: "use the image tool to get the image url for a picture of a falcon and report what you get back",
})

console.log(result)

Error Message and Stack Trace (if applicable)

file:///.../node_test/node_modules/@langchain/community/dist/utils/bedrock/anthropic.js:131
    throw new Error("Unsupported message content format");
          ^

Error: Unsupported message content format
    at file:///.../c/node_test/node_modules/@langchain/community/dist/utils/bedrock/anthropic.js:131:23
    at Array.map (<anonymous>)
    at _formatContent (file:///.../c/node_test/node_modules/@langchain/community/dist/utils/bedrock/anthropic.js:101:39)
    at file:///.../c/node_test/node_modules/@langchain/community/dist/utils/bedrock/anthropic.js:190:30
    at Array.map (<anonymous>)
    at formatMessagesForAnthropic (file:///.../c/node_test/node_modules/@langchain/community/dist/utils/bedrock/anthropic.js:147:52)
    at BedrockLLMInputOutputAdapter.prepareMessagesInput (file:///.../c/node_test/node_modules/@langchain/community/dist/utils/bedrock/index.js:131:61)
    at BedrockChat._signedFetch (file:///.../c/node_test/node_modules/@langchain/community/dist/chat_models/bedrock/web.js:688:44)
    at BedrockChat._streamResponseChunks (file:///.../c/node_test/node_modules/@langchain/community/dist/chat_models/bedrock/web.js:737:37)
    at _streamResponseChunks.next (<anonymous>)

Node.js v20.16.0

Description

I am using the LangChain library as illustrated above. It appears that Bedrock now uses a JSON message format that the BedrockChat class cannot yet handle. The fix does not look too involved, but I am hoping the LangChain team can prioritize making this class work with Bedrock again, since Amazon has changed the output formats.
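
Judging from the stack trace, the error is thrown while the outgoing messages are being formatted for the Anthropic Bedrock API (formatMessagesForAnthropic / _formatContent). The sketch below is only a guess at the kind of structured content block such a formatter might reject if it recognizes only plain text and image blocks; the block type and field names are assumptions for illustration, not values taken from this report.

import { AIMessage } from "@langchain/core/messages"

// Plain string content is the simple case a message formatter handles fine.
const plain = new AIMessage("Here is the image url.")

// Structured content with a non-text block (for example a tool-use block) is
// the kind of shape that a formatter recognizing only "text" and "image_url"
// blocks could reject with "Unsupported message content format".
const structured = new AIMessage({
  content: [
    { type: "text", text: "Calling the image tool." },
    {
      // Hypothetical block for illustration; these field names are assumptions.
      type: "tool_use",
      id: "toolu_example",
      name: "Get-Image-Tool",
      input: {},
    },
  ],
})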

System Info

npm --version: 10.8.1
node --version: v20.16.0
Platform: macOS Sonoma 14.4.1 (23E224)

dosubot[bot] commented 1 day ago

Hey @goldeneagle3636! I'm here to help you with the issue you're facing. Let's work together to solve this!

I found some similar issues and discussions that might be relevant to your problem:

  1. Bedrock integration not working with Claude 3.5 Sonnet: This issue was closed with the suggestion to use the BedrockChat class instead of the Bedrock class, since the Bedrock class does not support the claude-3-5-sonnet-20240620 model. BedrockChat is designed for chat models like Claude 3.5 Sonnet [1].

  2. Bedrock Inference Model IDs are out of support: This issue was closed after a pull request added support for Bedrock cross-region inference models. The solution was to use the ChatBedrockConverse model, which supports the inference model ID eu.anthropic.claude-3-5-sonnet-20240620-v1:0 [2] (see the sketch after this list).

  3. Tool-use with BedrockChat llama3.1: This discussion is still open and might provide additional insights or solutions related to tool usage with BedrockChat [3].
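
If switching integrations is an option, the ChatBedrockConverse suggestion from item 2 can be tried with a small change to the reproduction above. This is a minimal sketch, assuming the ChatBedrockConverse class from the separate @langchain/aws package and the same credential placeholders as the original example; whether it avoids the formatting error has not been verified here.

import { ChatBedrockConverse } from "@langchain/aws"

// Drop-in replacement for the BedrockChat model in the reproduction above;
// the rest of the agent setup (createToolCallingAgent, AgentExecutor) stays the same.
const model = new ChatBedrockConverse({
  model: "anthropic.claude-3-5-sonnet-20240620-v1:0",
  region: "us-east-1",
  credentials: {
    accessKeyId: "<>", // placeholder, as in the original example
    secretAccessKey: "<>", // placeholder, as in the original example
  },
})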

The error you're encountering suggests that the BedrockChat class is not handling the message content format correctly. This might be due to changes in the response format from Amazon Bedrock. You might want to check if there are any updates or patches available for the LangChain.js library that address this compatibility issue.

To continue talking to Dosu, mention @dosu.

jacoblee93 commented 10 hours ago

Thanks for reporting @goldeneagle3636!

CC @bracesproul could you have a look?