luizzappa opened this issue 2 weeks ago
I found a similar solved discussion that might help: having a difficult time implementing memory with an agent. The solution involves ensuring that the `saveContext` and `loadMemoryVariables` methods correctly manage the message content. You can refer to the LangChain documentation for more details [1].
The issue you're encountering with the AI message content becoming `undefined` when using `BufferWindowMemory` with Claude's model in LangChain.js is due to the `BufferWindowMemory` class not correctly handling the message content when saving and loading the context. Here is a potential fix:

1. Ensure the `saveContext` method correctly saves the message content.
2. Ensure the `loadMemoryVariables` method correctly loads the message content.

Here is an example of how to fix the issue:
```typescript
import { test, expect } from "@jest/globals";
import { HumanMessage, AIMessage } from "@langchain/core/messages";
import { InMemoryChatMessageHistory as ChatMessageHistory } from "@langchain/core/chat_history";
import { BufferWindowMemory } from "../buffer_window_memory.js";

test("Test buffer window memory", async () => {
  const memory = new BufferWindowMemory({ k: 1 });
  const result1 = await memory.loadMemoryVariables({});
  expect(result1).toStrictEqual({ history: "" });

  await memory.saveContext({ foo: "bar" }, { bar: "foo" });
  const expectedString = "Human: bar\nAI: foo";
  const result2 = await memory.loadMemoryVariables({});
  expect(result2).toStrictEqual({ history: expectedString });

  await memory.saveContext({ foo: "bar1" }, { bar: "foo" });
  const expectedString3 = "Human: bar1\nAI: foo";
  const result3 = await memory.loadMemoryVariables({});
  expect(result3).toStrictEqual({ history: expectedString3 });
});

test("Test buffer window memory return messages", async () => {
  const memory = new BufferWindowMemory({ k: 1, returnMessages: true });
  const result1 = await memory.loadMemoryVariables({});
  expect(result1).toStrictEqual({ history: [] });

  await memory.saveContext({ foo: "bar" }, { bar: "foo" });
  const expectedResult = [new HumanMessage("bar"), new AIMessage("foo")];
  const result2 = await memory.loadMemoryVariables({});
  expect(result2).toStrictEqual({ history: expectedResult });

  await memory.saveContext({ foo: "bar1" }, { bar: "foo" });
  const expectedResult2 = [new HumanMessage("bar1"), new AIMessage("foo")];
  const result3 = await memory.loadMemoryVariables({});
  expect(result3).toStrictEqual({ history: expectedResult2 });
});

test("Test buffer window memory with pre-loaded history", async () => {
  const pastMessages = [
    new HumanMessage("My name's Jonas"),
    new AIMessage("Nice to meet you, Jonas!"),
  ];
  const memory = new BufferWindowMemory({
    returnMessages: true,
    chatHistory: new ChatMessageHistory(pastMessages),
  });
  const result = await memory.loadMemoryVariables({});
  expect(result).toStrictEqual({ history: pastMessages });
});
```
Additionally, you can refer to the `BufferMemory` class implementation in `langchain/src/memory/buffer_memory.ts` for more robust handling of message content. This class includes methods for saving and loading context, managing token limits, and handling intermediate steps, which can be adapted to ensure the `BufferWindowMemory` class correctly handles message content [2][3][4].
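One way to apply that suggestion is to normalize each output value before it is written into chat history. The helper below is a sketch, not LangChain API: the name `normalizeContent` and the content-block shape are assumptions. A `BufferWindowMemory` subclass could run it over every output value before delegating to `super.saveContext`.

```typescript
// Hypothetical helper, not part of LangChain: flatten an Anthropic-style
// array of content blocks (e.g. [{ type: "text", text: "..." }]) into a
// plain string; any other value passes through untouched.
type ContentBlock = { type?: string; text: string };

function normalizeContent(value: unknown): unknown {
  const isBlockArray =
    Array.isArray(value) &&
    value.length > 0 &&
    value.every(
      (b) =>
        typeof b === "object" &&
        b !== null &&
        typeof (b as ContentBlock).text === "string"
    );
  return isBlockArray
    ? (value as ContentBlock[]).map((b) => b.text).join("")
    : value;
}
```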
Workaround with a monkey patch (note: the original snippet referenced `origFunc` without defining it; it must be captured before the method is replaced):

```typescript
import { InMemoryChatMessageHistory } from "@langchain/core/chat_history";

// Keep a reference to the original method before patching it.
const origFunc = InMemoryChatMessageHistory.prototype.addAIChatMessage;

InMemoryChatMessageHistory.prototype.addAIChatMessage = function (message: any) {
  // Anthropic models can return the content as an array of content blocks
  // (e.g. [{ type: "text", text: "..." }]) instead of a plain string.
  const isAnthropic =
    Array.isArray(message) &&
    message.length > 0 &&
    typeof message[0] === "object" &&
    message[0] !== null &&
    "text" in message[0];
  if (isAnthropic) {
    return origFunc.call(this, (message[0] as { text: string }).text);
  }
  return origFunc.call(this, message);
};
```
Checked other resources
Example Code
Error Message and Stack Trace (if applicable)
The error occurs in this part of the trace, but the cause is that the AI message `content` in memory has the value `undefined`.

Description
It is not possible to use memory with Claude's model with an agent executor. If you remove the memory, everything works.
I spent a few hours debugging the reason. There is a lot of internal abstraction that I don't quite understand, but the reason for this bug starts here:
Here, the `returnValues` property is an array, while LangChain's code expects a string:
https://github.com/langchain-ai/langchainjs/blob/3cc45fe7d82bbfeb8b0a86c7c5a7547291c94218/langchain/src/agents/executor.ts#L537-L539
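For concreteness, here is a minimal illustration of the mismatch (the shapes below are assumptions based on Anthropic's content-block format, not copied from the trace):

```typescript
// What the agent's finish step can carry back from Claude:
// an array of content blocks instead of a plain string.
const claudeReturnValues = {
  output: [{ type: "text", text: "The answer is 42." }],
};

// What the downstream memory code expects:
const stringReturnValues = {
  output: "The answer is 42.",
};

console.log(Array.isArray(claudeReturnValues.output)); // true — this is what breaks memory
console.log(typeof stringReturnValues.output); // "string"
```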
The code reaches this part with the `output` property of the `outputValues` variable as an array instead of a string:
https://github.com/langchain-ai/langchainjs/blob/3cc45fe7d82bbfeb8b0a86c7c5a7547291c94218/langchain/src/chains/base.ts#L123-L127
With that, this memory function receives the array instead of a string when it reads the `output` property:
https://github.com/langchain-ai/langchainjs/blob/3cc45fe7d82bbfeb8b0a86c7c5a7547291c94218/langchain-core/src/memory.ts#L88-L100
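The linked helper essentially does single-key extraction and returns whatever value sits under the output key, with no type check, so an array flows through unchanged. A simplified sketch (names simplified; the real code is at the link above):

```typescript
// Simplified sketch of the output-value lookup in the linked memory helper:
// the value is returned as-is, so an array of content blocks slips through
// where a string was expected.
function getOutputValue(
  outputValues: Record<string, unknown>,
  outputKey?: string
): unknown {
  if (outputKey !== undefined) {
    return outputValues[outputKey];
  }
  const keys = Object.keys(outputValues);
  if (keys.length === 1) {
    return outputValues[keys[0]];
  }
  throw new Error("Expected a single output key");
}
```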
The `addAIChatMessage` function is called with an array instead of a string:
https://github.com/langchain-ai/langchainjs/blob/3cc45fe7d82bbfeb8b0a86c7c5a7547291c94218/langchain-core/src/chat_history.ts#L61-L63
The `BaseMessage` class expects `fields` to be a string so it can be used as content, but since it is an array, this part of the code is never executed:
https://github.com/langchain-ai/langchainjs/blob/3cc45fe7d82bbfeb8b0a86c7c5a7547291c94218/langchain-core/src/messages/base.ts#L239-L251
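In effect, the constructor behaves like this simplified sketch (hypothetical names; the real logic is at the link above): only a string argument gets wrapped into a fields object with `content` set, so an array falls through that branch and `content` stays `undefined`.

```typescript
// Simplified sketch of the string-coercion branch in the message
// constructor: a string is wrapped into { content }, but a content-block
// array is treated as a fields object that has no `content` property.
type MessageFields = { content?: string };

function coerceFields(fields: string | MessageFields): MessageFields {
  if (typeof fields === "string") {
    return { content: fields };
  }
  // An array reaches this branch: it has no `content` key, so the
  // resulting message's content is undefined.
  return fields;
}
```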
So, when creating the AI message, the `content` is `undefined`:
https://github.com/langchain-ai/langchainjs/blob/3cc45fe7d82bbfeb8b0a86c7c5a7547291c94218/langchain-core/src/messages/base.ts#L263
System Info
N/A