langchain-ai / langchainjs

🦜🔗 Build context-aware reasoning applications 🦜🔗
https://js.langchain.com/docs/
MIT License

Bedrock integration not working with Claude 3.5 Sonnet. Errors with "claude-3-5-sonnet-20240620" is not supported on this API. Please use the Messages API instead. #6845

Open · stevedaskam opened this issue 2 days ago

stevedaskam commented 2 days ago

Example Code

import { Bedrock } from "@langchain/community/llms/bedrock";
import { BedrockEmbeddings } from "@langchain/aws";
import { Client } from "@opensearch-project/opensearch";
import { AwsSigv4Signer } from "@opensearch-project/opensearch/aws";
import { OpenSearchVectorStore } from "@langchain/community/vectorstores/opensearch";
import { defaultProvider } from "@aws-sdk/credential-provider-node";
import {
  ChatPromptTemplate,
  MessagesPlaceholder,
} from "@langchain/core/prompts";
import { createHistoryAwareRetriever } from "langchain/chains/history_aware_retriever";

const region = process.env.AWS_REGION || "us-east-1";
const endpoint = "https://abc123.us-east-1.aoss.amazonaws.com";

const model = new Bedrock({
  model: "anthropic.claude-3-5-sonnet-20240620-v1:0",
  region: process.env.BEDROCK_AWS_REGION ?? "us-east-1",
  temperature: 0,
  maxRetries: 2,
});

const embeddings = new BedrockEmbeddings({
  maxRetries: 5,
  region: region,
  model: "amazon.titan-embed-text-v2:0",
});

const client = new Client({
  ...AwsSigv4Signer({
    region: "us-east-1",
    service: "aoss",
    getCredentials: () => {
      const credentialsProvider = defaultProvider();
      return credentialsProvider();
    },
  }),
  node: endpoint,
});

const vectorStore = new OpenSearchVectorStore(embeddings, {
  // OpenSearch Serverless configuration
  client: client,
  indexName: "my-index",
  service: "aoss",
});

const QUESTION_GEN_TEMPLATE = `Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question.
    Chat History:
    {chat_history}
    Follow Up Input: {input}
    Standalone question:`;
const contextualizeQPrompt = ChatPromptTemplate.fromTemplate(QUESTION_GEN_TEMPLATE);

const historyAwareRetriever = await createHistoryAwareRetriever({
    llm: model,
    retriever: vectorStore.asRetriever(),
    rephrasePrompt: contextualizeQPrompt,
  });

const chat_history = [
    "\n\nHuman: What is Amazon S3? \n\nAssistant:",
  ];
const result = await historyAwareRetriever.invoke({"input": "\n\nHuman: Tell me more \n\nAssistant:", "chat_history": chat_history });

Error Message and Stack Trace (if applicable)

{ "errorType": "Error", "errorMessage": "Error 400: \"claude-3-5-sonnet-20240620\" is not supported on this API. Please use the Messages API instead.", "trace": [ "Error: Error 400: \"claude-3-5-sonnet-20240620\" is not supported on this API. Please use the Messages API instead.", " at Bedrock._call (file:///opt/nodejs/node_modules/@langchain/community/dist/llms/bedrock/web.js:181:19)", " at process.processTicksAndRejections (node:internal/process/task_queues:95:5)", " at async Promise.all (index 0)", " at async Bedrock._generate (file:///opt/nodejs/node_modules/@langchain/core/dist/language_models/llms.js:354:29)", " at async Bedrock._generateUncached (file:///opt/nodejs/node_modules/@langchain/core/dist/language_models/llms.js:169:26)", " at async Bedrock.invoke (file:///opt/nodejs/node_modules/@langchain/core/dist/language_models/llms.js:32:24)", " at async RunnableSequence.invoke (file:///opt/nodejs/node_modules/@langchain/core/dist/runnables/base.js:1144:33)", " at async RunnableBranch._invoke (file:///opt/nodejs/node_modules/@langchain/core/dist/runnables/branch.js:129:22)", " at async RunnableBranch._callWithConfig (file:///opt/nodejs/node_modules/@langchain/core/dist/runnables/base.js:217:22)", " at async Runtime.handler (file:///var/task/index.mjs:149:18)" ] }

Description

I'm trying to use Bedrock with Claude 3.5 via createHistoryAwareRetriever and I'm getting the above error. I see that this issue was fixed in the Python version of LangChain, but I didn't find any mention of it in this repo.

Related issue: https://github.com/langchain-ai/langchain/issues/18513
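
If I'm reading the error right, the Bedrock LLM class sends Anthropic's older Text Completions payload, while the Claude 3 family only accepts the Messages API shape. A rough sketch of the two request bodies as I understand them (field values are illustrative):

// Text Completions payload (what the Bedrock LLM class builds) -- rejected by Claude 3 models:
const textCompletionsBody = {
  prompt: "\n\nHuman: Tell me more\n\nAssistant:",
  max_tokens_to_sample: 256,
};

// Messages API payload (what Claude 3 models require):
const messagesBody = {
  anthropic_version: "bedrock-2023-05-31",
  max_tokens: 256,
  messages: [{ role: "user", content: "Tell me more" }],
};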

System Info

Running this on AWS lambda, Node.js 20.x

Lambda layer has these dependencies:

"dependencies": {
  "@aws-crypto/sha256-js": "^5.2.0",
  "@aws-sdk/client-dynamodb": "^3.651.1",
  "@aws-sdk/client-opensearch": "^3.637.0",
  "@aws-sdk/credential-provider-node": "^3.637.0",
  "@aws-sdk/node-http-handler": "^3.374.0",
  "@aws-sdk/protocol-http": "^3.374.0",
  "@aws-sdk/signature-v4": "^3.374.0",
  "@langchain/aws": "^0.0.10",
  "@langchain/community": "^0.2.31",
  "@opensearch-project/opensearch": "^3.0.0-beta.5",
  "aws-sdk": "^2.1686.0",
  "langchain": "^0.2.17"
}

dosubot[bot] commented 2 days ago

Hey @stevedaskam! I'm here to help you out with any bugs, questions, or contributions you might have. Let's work together to solve this issue!

I found a similar issue that might help resolve your problem. The error you're encountering is due to using an incorrect model identifier. The correct model identifier for Claude 3.5 should be "anthropic.claude-3-5-sonnet-20240620-v1:0".

Here's how you can update your code to use the correct model identifier:

import { BedrockChat } from "@langchain/community/chat_models/bedrock";
import { BedrockEmbeddings } from "@langchain/aws";
import { Client } from "@opensearch-project/opensearch";
import { AwsSigv4Signer } from "@opensearch-project/opensearch/aws";
import { OpenSearchVectorStore } from "@langchain/community/vectorstores/opensearch";
import { defaultProvider } from "@aws-sdk/credential-provider-node";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { createHistoryAwareRetriever } from "langchain/chains/history_aware_retriever";

const region = process.env.AWS_REGION || "us-east-1";
const endpoint = "https://abc123.us-east-1.aoss.amazonaws.com";

const model = new BedrockChat({
    model: "anthropic.claude-3-5-sonnet-20240620-v1:0",
    region: process.env.BEDROCK_AWS_REGION ?? "us-east-1",
    temperature: 0,
    maxRetries: 2,
});

const embeddings = new BedrockEmbeddings({
    maxRetries: 5,
    region: region,
    model: "amazon.titan-embed-text-v2:0",
});

const client = new Client({
    ...AwsSigv4Signer({
        region: "us-east-1",
        service: "aoss",
        getCredentials: () => {
            const credentialsProvider = defaultProvider();
            return credentialsProvider();
        },
    }),
    node: endpoint
});

const vectorStore = new OpenSearchVectorStore(embeddings, {
    client: client,
    indexName: "my-index",
    service: 'aoss'
});

const QUESTION_GEN_TEMPLATE = `Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question.
    Chat History:
    {chat_history}
    Follow Up Input: {input}
    Standalone question:`;
const contextualizeQPrompt = ChatPromptTemplate.fromTemplate(QUESTION_GEN_TEMPLATE);

const historyAwareRetriever = await createHistoryAwareRetriever({
    llm: model,
    retriever: vectorStore.asRetriever(),
    rephrasePrompt: contextualizeQPrompt,
});

const chat_history = [
    "\n\nHuman: What is Amazon S3? \n\nAssistant:",
];
const result = await historyAwareRetriever.invoke({"input": "\n\nHuman: Tell me more \n\nAssistant:", "chat_history": chat_history });

Ensure that you replace "claude-3-5-sonnet-20240620" with "anthropic.claude-3-5-sonnet-20240620-v1:0" in your code.


stevedaskam commented 2 days ago

I am already using that model ID, just with Bedrock, not BedrockChat.

dosubot[bot] commented 2 days ago

To resolve the issue, use the BedrockChat class instead of the Bedrock class for the Anthropic model. The Bedrock class calls the text-completions API, which the claude-3-5-sonnet-20240620 model does not support; BedrockChat uses the Messages API instead. Here is how you can update your code:

import { BedrockChat } from "@langchain/community/chat_models/bedrock";
import { BedrockEmbeddings } from "@langchain/aws";
import { Client } from "@opensearch-project/opensearch";
import { AwsSigv4Signer } from "@opensearch-project/opensearch/aws";
import { OpenSearchVectorStore } from "@langchain/community/vectorstores/opensearch";
import { defaultProvider } from "@aws-sdk/credential-provider-node";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { createHistoryAwareRetriever } from "langchain/chains/history_aware_retriever";

const region = process.env.AWS_REGION || "us-east-1";
const endpoint = "https://abc123.us-east-1.aoss.amazonaws.com";

const model = new BedrockChat({
    model: "anthropic.claude-3-5-sonnet-20240620-v1:0",
    region: process.env.BEDROCK_AWS_REGION ?? "us-east-1",
    temperature: 0,
    maxRetries: 2,
});

const embeddings = new BedrockEmbeddings({
    maxRetries: 5,
    region: region,
    model: "amazon.titan-embed-text-v2:0",
});

const client = new Client({
    ...AwsSigv4Signer({
        region: "us-east-1",
        service: "aoss",
        getCredentials: () => {
            const credentialsProvider = defaultProvider();
            return credentialsProvider();
        },
    }),
    node: endpoint
});

const vectorStore = new OpenSearchVectorStore(embeddings, {
    client: client,
    indexName: "my-index",
    service: 'aoss'
});

const QUESTION_GEN_TEMPLATE = `Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question.
    Chat History:
    {chat_history}
    Follow Up Input: {input}
    Standalone question:`;
const contextualizeQPrompt = ChatPromptTemplate.fromTemplate(QUESTION_GEN_TEMPLATE);

const historyAwareRetriever = await createHistoryAwareRetriever({
    llm: model,
    retriever: vectorStore.asRetriever(),
    rephrasePrompt: contextualizeQPrompt,
});

const chat_history = [
    "\n\nHuman: What is Amazon S3? \n\nAssistant:",
];
const result = await historyAwareRetriever.invoke({"input": "\n\nHuman: Tell me more \n\nAssistant:", "chat_history": chat_history });

This should resolve the "Error 400: 'claude-3-5-sonnet-20240620' is not supported on this API" issue by using the correct BedrockChat class for the Anthropic model.
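
As a quick sanity check (a minimal sketch, reusing the model constructed above), invoking the chat model directly should now succeed:

// Smoke test: a direct call confirms the Messages API path works before wiring up the retriever.
const response = await model.invoke("What is Amazon S3?");
console.log(response.content);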


stevedaskam commented 2 days ago

Will this work with ChatBedrockConverse?

dosubot[bot] commented 2 days ago

Yes, this will work with ChatBedrockConverse. Here is an example of how you can update your code to use ChatBedrockConverse:

import { ChatBedrockConverse, BedrockEmbeddings } from "@langchain/aws";
import { Client } from "@opensearch-project/opensearch";
import { AwsSigv4Signer } from "@opensearch-project/opensearch/aws";
import { OpenSearchVectorStore } from "@langchain/community/vectorstores/opensearch";
import { defaultProvider } from "@aws-sdk/credential-provider-node";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { createHistoryAwareRetriever } from "langchain/chains/history_aware_retriever";

const region = process.env.AWS_REGION || "us-east-1";
const endpoint = "https://abc123.us-east-1.aoss.amazonaws.com";

const model = new ChatBedrockConverse({
    model: "anthropic.claude-3-5-sonnet-20240620-v1:0",
    region: process.env.BEDROCK_AWS_REGION ?? "us-east-1",
    temperature: 0,
    maxRetries: 2,
    // Optional: omit credentials to fall back to the default Node credential chain.
    credentials: {
        accessKeyId: process.env.BEDROCK_AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.BEDROCK_AWS_SECRET_ACCESS_KEY,
    },
});

const embeddings = new BedrockEmbeddings({
    maxRetries: 5,
    region: region,
    model: "amazon.titan-embed-text-v2:0",
});

const client = new Client({
    ...AwsSigv4Signer({
        region: "us-east-1",
        service: "aoss",
        getCredentials: () => {
            const credentialsProvider = defaultProvider();
            return credentialsProvider();
        },
    }),
    node: endpoint
});

const vectorStore = new OpenSearchVectorStore(embeddings, {
    client: client,
    indexName: "my-index",
    service: 'aoss'
});

const QUESTION_GEN_TEMPLATE = `Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question.
    Chat History:
    {chat_history}
    Follow Up Input: {input}
    Standalone question:`;
const contextualizeQPrompt = ChatPromptTemplate.fromTemplate(QUESTION_GEN_TEMPLATE);

const historyAwareRetriever = await createHistoryAwareRetriever({
    llm: model,
    retriever: vectorStore.asRetriever(),
    rephrasePrompt: contextualizeQPrompt,
});

const chat_history = [
    "\n\nHuman: What is Amazon S3? \n\nAssistant:",
];
const result = await historyAwareRetriever.invoke({"input": "\n\nHuman: Tell me more \n\nAssistant:", "chat_history": chat_history });

This should resolve the issue by using the ChatBedrockConverse class, which supports the claude-3-5-sonnet-20240620 model.
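
Note that with chat models, chat_history is typically passed as message objects through a MessagesPlaceholder rather than raw "\n\nHuman: ... \n\nAssistant:" strings. A sketch (the rephrase wording and the AIMessage content are illustrative):

import { HumanMessage, AIMessage } from "@langchain/core/messages";
import {
  ChatPromptTemplate,
  MessagesPlaceholder,
} from "@langchain/core/prompts";

// A rephrase prompt with a structured slot for prior messages.
const contextualizeQPrompt = ChatPromptTemplate.fromMessages([
  new MessagesPlaceholder("chat_history"),
  [
    "user",
    "Given the conversation above, rephrase the following as a standalone question: {input}",
  ],
]);

const historyAwareRetriever = await createHistoryAwareRetriever({
  llm: model,
  retriever: vectorStore.asRetriever(),
  rephrasePrompt: contextualizeQPrompt,
});

// Message objects instead of Anthropic-style prompt strings.
const chatHistory = [
  new HumanMessage("What is Amazon S3?"),
  new AIMessage("Amazon S3 is an object storage service."), // illustrative
];

const result = await historyAwareRetriever.invoke({
  input: "Tell me more",
  chat_history: chatHistory,
});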
