Closed chunf closed 5 months ago
To resolve the issue with importing ConversationSummaryBufferMemory
from 'langchain/memory', you should use the following import statement:
import { ConversationSummaryBufferMemory } from 'langchain/memory/index';
Here is an example usage:
import { OpenAI, ChatOpenAI } from "@langchain/openai";
import { ConversationSummaryBufferMemory } from "langchain/memory/index";
import { ConversationChain } from "langchain/chains";
import {
ChatPromptTemplate,
HumanMessagePromptTemplate,
MessagesPlaceholder,
SystemMessagePromptTemplate,
} from "@langchain/core/prompts";
// summary buffer memory
const memory = new ConversationSummaryBufferMemory({
llm: new OpenAI({ model: "gpt-3.5-turbo-instruct", temperature: 0 }),
maxTokenLimit: 10,
});
await memory.saveContext({ input: "hi" }, { output: "whats up" });
await memory.saveContext({ input: "not much you" }, { output: "not much" });
const history = await memory.loadMemoryVariables({});
console.log({ history });
// We can also get the history as a list of messages (this is useful if you are using this with a chat prompt).
const chatPromptMemory = new ConversationSummaryBufferMemory({
llm: new ChatOpenAI({ model: "gpt-3.5-turbo", temperature: 0 }),
maxTokenLimit: 10,
returnMessages: true,
});
await chatPromptMemory.saveContext({ input: "hi" }, { output: "whats up" });
await chatPromptMemory.saveContext(
{ input: "not much you" },
{ output: "not much" }
);
// We can also call the predictNewSummary method directly.
const messages = await chatPromptMemory.chatHistory.getMessages();
const previousSummary = "";
const predictSummary = await chatPromptMemory.predictNewSummary(
messages,
previousSummary
);
console.log(JSON.stringify(predictSummary));
// Using in a chain
const chatPrompt = ChatPromptTemplate.fromMessages([
SystemMessagePromptTemplate.fromTemplate(
"The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know."
),
new MessagesPlaceholder("history"),
HumanMessagePromptTemplate.fromTemplate("{input}"),
]);
const model = new ChatOpenAI({ temperature: 0.9, verbose: true });
const chain = new ConversationChain({
llm: model,
memory: chatPromptMemory,
prompt: chatPrompt,
});
const res1 = await chain.invoke({ input: "Hi, what's up?" });
console.log({ res1 });
const res2 = await chain.invoke({
input: "Just working on writing some documentation!",
});
console.log({ res2 });
const res3 = await chain.invoke({
input: "For LangChain! Have you heard of it?",
});
console.log({ res3 });
const res4 = await chain.invoke({
input:
"That's not the right one, although a lot of people confuse it for that!",
});
console.log({ res4 });
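The maxTokenLimit behavior in the example above can be sketched without an API key. This is a hypothetical simplification, not LangChain's actual implementation: the names `pruneWithSummary` and `countTokens`, and the word-count stand-in for a real tokenizer, are all assumptions for illustration; a real summary buffer memory would call the LLM where the stub summarizer runs.

```typescript
// Hypothetical sketch: once buffered messages exceed maxTokenLimit, the
// oldest messages are pruned and folded into a running summary.
type Message = { role: "human" | "ai"; text: string };

// Crude stand-in for a real tokenizer: one "token" per whitespace-split word.
const countTokens = (msgs: Message[]): number =>
  msgs.reduce((n, m) => n + m.text.split(/\s+/).length, 0);

function pruneWithSummary(
  buffer: Message[],
  summary: string,
  maxTokenLimit: number,
  summarize: (summary: string, dropped: Message[]) => string
): { buffer: Message[]; summary: string } {
  const dropped: Message[] = [];
  while (buffer.length > 0 && countTokens(buffer) > maxTokenLimit) {
    dropped.push(buffer.shift() as Message);
  }
  return {
    buffer,
    summary: dropped.length ? summarize(summary, dropped) : summary,
  };
}

// Stub summarizer: a real memory would make an LLM call here instead.
const stubSummarize = (summary: string, dropped: Message[]) =>
  [summary, ...dropped.map((m) => `${m.role}: ${m.text}`)].join(" ").trim();

const { buffer, summary } = pruneWithSummary(
  [
    { role: "human", text: "hi" },
    { role: "ai", text: "whats up" },
    { role: "human", text: "not much you" },
    { role: "ai", text: "not much" },
  ],
  "",
  4,
  stubSummarize
);
console.log(summary); // older turns folded into the summary
console.log(buffer.map((m) => m.text)); // only the most recent turn remains
```

With a limit of 4 "tokens", the first three turns are dropped from the buffer and folded into the summary, leaving only the latest reply — the same shape of result `loadMemoryVariables` returns above, where old turns appear as a system summary message.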
This should resolve the issue without needing to modify the package.json
file.
Very sorry about that; just shipped a fix to re-add the old entrypoint. It is live in langchain@0.2.2
Example Code
import { ConversationSummaryBufferMemory } from 'langchain/memory'
Error Message and Stack Trace (if applicable)
Error: Package subpath './memory' is not defined by "exports" in node_modules\langchain\package.json
Description
I am trying to use ConversationSummaryBufferMemory by importing it from 'langchain/memory'. The issue is fixed if I modify line 1233 of langchain's package.json to:
"./memory": {
  "types": {
    "import": "./memory/index.d.ts",
    "require": "./memory/index.d.cts",
    "default": "./memory/index.d.ts"
  },
  "import": "./memory/index.js",
  "require": "./memory/index.cjs"
},
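The error occurs because Node refuses any subpath import that is not listed in the package's "exports" map. A minimal sketch of that lookup is below; it is a deliberate simplification (real resolution also handles conditions like "import"/"require" and wildcard patterns), and `resolveSubpath` and the sample map are invented for illustration.

```typescript
// Simplified sketch of Node's "exports" subpath check.
type ExportsMap = Record<string, unknown>;

function resolveSubpath(exportsMap: ExportsMap, subpath: string): unknown {
  if (!(subpath in exportsMap)) {
    // Mirrors the ERR_PACKAGE_PATH_NOT_EXPORTED error seen above.
    throw new Error(
      `Package subpath '${subpath}' is not defined by "exports" in package.json`
    );
  }
  return exportsMap[subpath];
}

// Sample map with the deep entrypoint present but "./memory" missing,
// matching the situation before the fix shipped in langchain@0.2.2.
const exportsMap: ExportsMap = {
  ".": "./index.js",
  "./memory/index": "./memory/index.js",
};

console.log(resolveSubpath(exportsMap, "./memory/index")); // "./memory/index.js"
try {
  resolveSubpath(exportsMap, "./memory");
} catch (e) {
  console.log((e as Error).message); // subpath not listed, so the import throws
}
```

This is why both workarounds function: importing from 'langchain/memory/index' hits an entry that is still in the map, while editing package.json adds the missing "./memory" entry.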
System Info
Node v20.9.0
"langchain": "^0.2.1"