Closed: Jalmeida1994 closed this issue 2 months ago
Hey @Jalmeida1994! 👋 I'm here to help you out with any bugs, questions, or contributions while we wait for a human maintainer. Let's squash those bugs together!
To resolve the `ValidationException` error indicating that the `text` field in the `ContentBlock` object at `messages.1.content.0` is blank, you need to ensure that all messages passed to the `invoke` method contain non-empty content.

Here's an example of how you can invoke the `ChatBedrockConverse` model with properly populated messages:
```typescript
import { ChatBedrockConverse } from "@langchain/aws";
import { HumanMessage } from "@langchain/core/messages";

const model = new ChatBedrockConverse({
  model: "anthropic.claude-3-sonnet-20240229-v1:0",
  region: "us-east-1",
  credentials: {
    accessKeyId: process.env.BEDROCK_AWS_ACCESS_KEY_ID!,
    secretAccessKey: process.env.BEDROCK_AWS_SECRET_ACCESS_KEY!,
  },
});

const messages = [
  new HumanMessage({ content: "Print hello world" }),
  new HumanMessage({ content: "What is the weather today?" }),
];

const res = await model.invoke(messages);
console.log(res);
```
Ensure that each `HumanMessage` or any other message type you use has non-empty content. This should prevent the `ValidationException` error related to blank `text` fields.

Additionally, the `convertToConverseMessages` function in the `libs/langchain-aws/src/common.ts` file ensures that the content of each message is properly converted to the required format. Make sure that your messages are correctly formatted and contain valid content blocks, as shown in the example [1][2][3].
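As a quick sanity check before calling `invoke`, you could scan the message list for content that would serialize to a blank `ContentBlock`. The sketch below works on the structural shape of message content only (a plain string or an array of text blocks); `isBlankMessage` and `assertNoBlankMessages` are hypothetical helpers, not part of LangChain.

```typescript
// Sketch: flag messages whose content would produce a blank ContentBlock.
// Operates on the structural shape of LangChain message content (a string,
// or an array of { type: "text", text } blocks); not an official API.

type ContentBlock = { type: string; text?: string };
type MessageLike = { content: string | ContentBlock[] };

function isBlankMessage(msg: MessageLike): boolean {
  if (typeof msg.content === "string") {
    return msg.content.trim() === "";
  }
  // An array counts as blank if every text block it contains is empty.
  return msg.content.every(
    (block) => block.type === "text" && (block.text ?? "").trim() === ""
  );
}

function assertNoBlankMessages(messages: MessageLike[]): void {
  messages.forEach((msg, i) => {
    if (isBlankMessage(msg)) {
      throw new Error(`Message at index ${i} has blank content`);
    }
  });
}
```

Running this over the history right before `invoke` at least tells you which index the Converse API will reject, matching the `messages.1.content.0` pointer in the error.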
To continue talking to Dosu, mention @dosu.
Hey @dosu, you friendly bot!
I understand the need to ensure that all messages passed to the invoke method contain non-empty content. However, the issue arises from the agent itself adding the empty content message. The content is empty because of the `stopReason: "tool_use"`.
When the agent is set to use a tool, it stops and returns the tool that it will use. It executes the tool correctly, and in the messages history we get a `ToolMessage` with the correct content. However, before that, we receive an `AIMessage` indicating tool usage but with empty content.
Example of the message history:
```
{
  "HumanMessage": {
    "lc_serializable": true,
    "lc_kwargs": {
      "content": "Use a tool please",
      "additional_kwargs": {},
      "response_metadata": {}
    },
    "lc_namespace": ["langchain_core", "messages"],
    "content": "Use a tool please",
    "name": "undefined",
    "additional_kwargs": {},
    "response_metadata": {},
    "id": "undefined"
  },
  "AIMessage": {
    "lc_serializable": true,
    "lc_kwargs": {
      "content": "",
      "tool_calls": [{ "id": "tooluse_e68eeY31Rju83TWq5n-e1A", "name": "retrieve_internal_documents", "args": {} }],
      "response_metadata": {
        "$metadata": { "httpStatusCode": 200, "requestId": "fba07d1c-cfda-4609-a603-ad589da3374a", "attempts": 1, "totalRetryDelay": 0 },
        "metrics": { "latencyMs": 1718 },
        "stopReason": "tool_use",
        "usage": { "inputTokens": 446, "outputTokens": 56, "totalTokens": 502 }
      },
      "usage_metadata": { "input_tokens": 446, "output_tokens": 56, "total_tokens": 502 },
      "id": "fba07d1c-cfda-4609-a603-ad589da3374a",
      "invalid_tool_calls": [],
      "additional_kwargs": {}
    }
  },
  "ToolMessage": {
    "lc_serializable": true,
    "lc_kwargs": {
      "content": "bla bla"
      <rest of correct output from tool>
```
This issue didn't occur with `ChatOpenAI` or `BedrockChat`, and only started after upgrading to `ChatBedrockConverse`. I believe the agent is correctly performing the tool usage but somehow leaves an empty `AIMessage` behind, and the Converse API does not like that.
The issue you're encountering with `ChatBedrockConverse` seems to be related to how the agent handles tool usage, resulting in an `AIMessage` with empty content. This behavior is specific to `ChatBedrockConverse` and wasn't present with `ChatOpenAI` or `BedrockChat`.

To address this, you can filter out any `AIMessage` objects with empty content before invoking the model. Here's an example of how you can modify your code to achieve this:
```typescript
import { ChatBedrockConverse } from "@langchain/aws";
import { AIMessage, HumanMessage } from "@langchain/core/messages";
import type { BaseMessage } from "@langchain/core/messages";

const model = new ChatBedrockConverse({
  model: "anthropic.claude-3-sonnet-20240229-v1:0",
  region: "us-east-1",
  credentials: {
    accessKeyId: process.env.BEDROCK_AWS_ACCESS_KEY_ID!,
    secretAccessKey: process.env.BEDROCK_AWS_SECRET_ACCESS_KEY!,
  },
});

const filterEmptyMessages = (messages: BaseMessage[]) => {
  return messages.filter((message) => {
    if (message instanceof AIMessage && !message.content) {
      return false;
    }
    return true;
  });
};

const messages = [
  new HumanMessage({ content: "Use a tool please" }),
  new AIMessage({
    content: "",
    tool_calls: [{ id: "tooluse_e68eeY31Rju83TWq5n-e1A", name: "retrieve_internal_documents", args: {} }],
    response_metadata: {
      $metadata: { httpStatusCode: 200, requestId: "fba07d1c-cfda-4609-a603-ad589da3374a", attempts: 1, totalRetryDelay: 0 },
      metrics: { latencyMs: 1718 },
      stopReason: "tool_use",
      usage: { inputTokens: 446, outputTokens: 56, totalTokens: 502 },
    },
    usage_metadata: { input_tokens: 446, output_tokens: 56, total_tokens: 502 },
    id: "fba07d1c-cfda-4609-a603-ad589da3374a",
    invalid_tool_calls: [],
    additional_kwargs: {},
  }),
  new HumanMessage({ content: "What is the weather today?" }),
];

const filteredMessages = filterEmptyMessages(messages);
const res = await model.invoke(filteredMessages);
console.log(res);
```
This code filters out any `AIMessage` objects with empty content before passing the messages to the `invoke` method, which should prevent the `ValidationException` error [1][2][3].
We tried filtering out any AIMessage objects with empty content before invoking the model. However, we encounter another issue when we do this. Here's the error message we receive:
```
ValidationException: The model returned the following errors: Your API request included an `assistant` message in the final position, which would pre-fill the `assistant` response. When using tools, pre-filling the `assistant` response is not supported.
    at de_ValidationExceptionRes (/Users/test/code/bot/node_modules/.pnpm/@aws-sdk+client-bedrock-runtime@3.606.0/node_modules/@aws-sdk/client-bedrock-runtime/dist-cjs/index.js:1082:21)
    at de_CommandError (/Users/test/code/bot/node_modules/.pnpm/@aws-sdk+client-bedrock-runtime@3.606.0/node_modules/@aws-sdk/client-bedrock-runtime/dist-cjs/index.js:937:19)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async /Users/test/code/bot/node_modules/.pnpm/@smithy+middleware-serde@3.0.3/node_modules/@smithy/middleware-serde/dist-cjs/index.js:35:20
    at async /Users/test/code/bot/node_modules/.pnpm/@smithy+core@2.2.4/node_modules/@smithy/core/dist-cjs/index.js:165:18
    at async /Users/test/code/bot/node_modules/.pnpm/@smithy+middleware-retry@3.0.7/node_modules/@smithy/middleware-retry/dist-cjs/index.js:320:38
    at async /Users/test/code/bot/node_modules/.pnpm/@aws-sdk+middleware-logger@3.598.0/node_modules/@aws-sdk/middleware-logger/dist-cjs/index.js:34:22
    at async ChatBedrockConverse._generateNonStreaming (/Users/test/code/bot/node_modules/.pnpm/@langchain+aws@0.0.1_@aws-sdk+client-sso-oidc@3.606.0_@aws-sdk+client-sts@3.606.0__@aws-sdk+c_wops3cctew2kpyygicqzgjk7su/node_modules/@langchain/aws/dist/chat_models.cjs:222:26)
    at async Promise.allSettled (index 0)
    at async ChatBedrockConverse._generateUncached (/Users/test/code/bot/node_modules/.pnpm/@langchain+core@0.2.12_langchain@0.2.8_@aws-sdk+credential-provider-node@3.600.0_@aws-sdk+cli_6wi7pugnjf7kzne7hkijzr5xv4/node_modules/@langchain/core/dist/language_models/chat_models.cjs:177:29)
```
It seems that when a `ToolMessage` is present as the last message, the Converse API does not generate a completion because the last message needs to be a human message (or maybe it's a completely unrelated issue, ahah). This behavior didn't occur with `ChatOpenAI` or `BedrockChat`, and only started after upgrading to `ChatBedrockConverse`.
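An alternative to filtering (which, as above, can leave an assistant message in the final position or break the tool-call pairing) is to keep the empty tool-use `AIMessage` but give it placeholder text, so the Converse validation passes while the `tool_calls` and the message ordering stay intact. Below is a rough sketch over plain message-shaped objects; `fillEmptyToolUseContent` and the `"(calling tool)"` placeholder are my own invention, not a LangChain API.

```typescript
// Sketch of a workaround: instead of removing the empty tool-use AIMessage,
// give it placeholder text so the Converse API's "text field ... is blank"
// validation passes and the human/assistant/tool alternation is preserved.
// Structural types only; not a LangChain API.

type ToolCall = { id: string; name: string; args: Record<string, unknown> };
type MessageLike = {
  _type: "human" | "ai" | "tool";
  content: string;
  tool_calls?: ToolCall[];
};

function fillEmptyToolUseContent(messages: MessageLike[]): MessageLike[] {
  return messages.map((msg) => {
    const isEmptyToolUse =
      msg._type === "ai" &&
      msg.content.trim() === "" &&
      (msg.tool_calls?.length ?? 0) > 0;
    // Hypothetical placeholder text; any non-empty string should satisfy
    // the ContentBlock validation.
    return isEmptyToolUse ? { ...msg, content: "(calling tool)" } : msg;
  });
}
```

In a real agent loop you would run this over the history right before handing it to `ChatBedrockConverse`, mapping to and from the actual `AIMessage` class instead of these plain objects.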
+1
+1
Same issue as @Jalmeida1994 when a ToolMessage is present as the last message
FYI - not related to langchain, but I had this problem when using the Bedrock Converse API. Switched to the raw InvokeModel API and didn't have this anymore.
@dsunjka if you don't mind, would you please be so kind to share your code snippet that solved this problem?
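For anyone looking for the general shape in the meantime: with the raw InvokeModel API, Claude 3 models on Bedrock take a request body in the Anthropic Messages format. The sketch below is only a guess at what such a call looks like, not @dsunjka's actual code; `buildClaudeBody` is a made-up helper name.

```typescript
// Sketch: build the Anthropic Messages request body that Bedrock's raw
// InvokeModel API expects for Claude 3 models. buildClaudeBody is a
// hypothetical helper, not part of any SDK.
function buildClaudeBody(prompt: string): string {
  return JSON.stringify({
    anthropic_version: "bedrock-2023-05-31",
    max_tokens: 1024,
    messages: [{ role: "user", content: [{ type: "text", text: prompt }] }],
  });
}

// Sending it with the AWS SDK would look roughly like:
//
//   import { BedrockRuntimeClient, InvokeModelCommand }
//     from "@aws-sdk/client-bedrock-runtime";
//   const client = new BedrockRuntimeClient({ region: "us-east-1" });
//   const res = await client.send(new InvokeModelCommand({
//     modelId: "anthropic.claude-3-sonnet-20240229-v1:0",
//     contentType: "application/json",
//     accept: "application/json",
//     body: buildClaudeBody("Print hello world"),
//   }));
//   console.log(new TextDecoder().decode(res.body));
```

Note that with InvokeModel you manage the tool-use message history yourself, which is presumably why the Converse-specific validation above no longer triggers.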
Was this fix also made to Python client?
Checked other resources
Example Code
Maybe any example from the LangGraph.js examples that uses tools, but with `ChatBedrockConverse` and model `anthropic.claude-3-sonnet-20240229-v1:0`?
Error Message and Stack Trace (if applicable)
```
Error in CompletionController.generateCompletion() The text field in the ContentBlock object at messages.1.content.0 is blank. Add text to the text field, and try again.
"name": "ValidationException"
"stack": "ValidationException: The text field in the ContentBlock object at messages.1.content.0 is blank. Add text to the text field, and try again.
    at de_ValidationExceptionRes (/Users/test/code/bot/node_modules/.pnpm/@aws-sdk+client-bedrock-runtime@3.606.0/node_modules/@aws-sdk/client-bedrock-runtime/dist-cjs/index.js:1082:21)
    at de_CommandError (/Users/test/code/bot/node_modules/.pnpm/@aws-sdk+client-bedrock-runtime@3.606.0/node_modules/@aws-sdk/client-bedrock-runtime/dist-cjs/index.js:937:19)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async /Users/test/code/bot/node_modules/.pnpm/@smithy+middleware-serde@3.0.3/node_modules/@smithy/middleware-serde/dist-cjs/index.js:35:20
    at async /Users/test/code/bot/node_modules/.pnpm/@smithy+core@2.2.4/node_modules/@smithy/core/dist-cjs/index.js:165:18
    at async /Users/test/code/bot/node_modules/.pnpm/@smithy+middleware-retry@3.0.7/node_modules/@smithy/middleware-retry/dist-cjs/index.js:320:38
    at async /Users/test/code/bot/node_modules/.pnpm/@aws-sdk+middleware-logger@3.598.0/node_modules/@aws-sdk/middleware-logger/dist-cjs/index.js:34:22
    at async ChatBedrockConverse._generateNonStreaming (/Users/test/code/bot/node_modules/.pnpm/@langchain+aws@0.0.1_@aws-sdk+client-sso-oidc@3.606.0_@aws-sdk+client-sts@3.606.0__@aws-sdk+c_wops3cctew2kpyygicqzgjk7su/node_modules/@langchain/aws/dist/chat_models.cjs:222:26)
    at async Promise.allSettled (index 0)
    at async ChatBedrockConverse._generateUncached (/Users/test/code/bot/node_modules/.pnpm/@langchain+core@0.2.12_langchain@0.2.8_@aws-sdk+credential-provider-node@3.600.0_@aws-sdk+cli_6wi7pugnjf7kzne7hkijzr5xv4/node_modules/@langchain/core/dist/language_models/chat_models.cjs:177:29)"
"timestamp": "2024-07-02T13:59:04.567Z"
```
Description
I'm encountering an issue when invoking the Bedrock Claude model via the Converse API using LangChain's `ChatBedrockConverse`. The error message is shown above. This issue does not occur when using `ChatOpenAI` or `BedrockChat`. It seems specific to `ChatBedrockConverse`.

Steps to Reproduce: the invocation fails with a `ValidationException` due to a blank `text` field in the `ContentBlock` object.
object.Expected Behavior: The model should handle tool usage without resulting in empty content messages, or the API should handle such cases gracefully - which I know it's not with Langchain.
Actual Behavior: The process fails with a
ValidationException
when an empty content message is encountered.Additional Information:
ChatBedrockConverse
.The model returned the following errors: Your API request included an 'assistant' message in the final position, which would pre-fill the 'assistant' response. When using tools, pre-filling the 'assistant' response is not supported.
Example AI message with empty content calling a tool:
PS: If you feel this issue might be better shared with the LangGraph JS maintainers or if it involves specific nuances of the AWS Bedrock Converse API, please let me know. I initially reported it here since ChatBedrockConverse is hosted within LangChain.
System Info