
Error: Missing value for input variable `history` with ConversationSummaryBufferMemory #6718

Open · mauriciocirelli opened this issue 1 month ago

mauriciocirelli commented 1 month ago

Hello,

I am using the exact example below from the documentation.

import { ChatOpenAI } from "@langchain/openai";
import { ConversationSummaryBufferMemory } from "langchain/memory";
import { ConversationChain } from "langchain/chains";
import {
  ChatPromptTemplate,
  HumanMessagePromptTemplate,
  MessagesPlaceholder,
  SystemMessagePromptTemplate,
} from "@langchain/core/prompts";

// Initialize the memory with a specific model and token limit
const memory = new ConversationSummaryBufferMemory({
  llm: new ChatOpenAI({ modelName: "gpt-3.5-turbo-instruct", temperature: 0 }),
  maxTokenLimit: 10,
});

// Save conversation context to memory
await memory.saveContext({ input: "hi" }, { output: "whats up" });
await memory.saveContext({ input: "not much you" }, { output: "not much" });

// Load the conversation history from memory
const history = await memory.loadMemoryVariables({});
console.log({ history });

// Create a chat prompt using the conversation history
const chatPrompt = ChatPromptTemplate.fromMessages([
  SystemMessagePromptTemplate.fromTemplate(
    "The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.",
  ),
  new MessagesPlaceholder("history"),
  HumanMessagePromptTemplate.fromTemplate("{input}"),
]);

// Initialize the conversation chain with the model, memory, and prompt
const chain = new ConversationChain({
  llm: new ChatOpenAI({ temperature: 0.9, verbose: true }),
  memory: memory,
  prompt: chatPrompt,
});
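
For reference, invoking the chain with any input is what triggers the error (the exact input string does not matter):

// Calling the chain forces the prompt to format, which is where it fails:
const response = await chain.invoke({ input: "Hi! What is the weather like?" });
// Error: Missing value for input variable `history`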

The history variable is not injected into the prompt, so I get the error above.

Is this a bug in the docs, or should the template be formatted differently so that the history variable gets injected?
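
One possibly relevant detail: the chat-model variant of this example in the docs constructs the memory with returnMessages: true, so that MessagesPlaceholder("history") receives message objects rather than a summary string. A minimal sketch of that variant:

// Chat-model variant from the docs (note returnMessages):
const chatMemory = new ConversationSummaryBufferMemory({
  llm: new ChatOpenAI({ temperature: 0 }),
  maxTokenLimit: 10,
  returnMessages: true, // hand the placeholder BaseMessage objects, not a string
});

Could that be the missing piece here?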

Thank you.

System Info

npm info langchain

langchain@0.2.18 | MIT | deps: 14 | versions: 291
Typescript bindings for langchain
https://github.com/langchain-ai/langchainjs/tree/main/langchain/

dosubot[bot] commented 1 month ago

Yes, the prompt template should include a slot for the history variable. The RunnableWithMessageHistory class is designed to manage message history and inject it into the chain input. The example in the documentation shows the history variable being used inside the prompt template:

const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You're an assistant who's good at {ability}"],
  new MessagesPlaceholder("history"),
  ["human", "{question}"],
]);
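
In the docs, that prompt is then piped into a chat model to form the runnable that gets wrapped below (sketched here with an arbitrary model configuration):

// The runnable later passed to RunnableWithMessageHistory:
const chain = prompt.pipe(new ChatOpenAI({ temperature: 0 }));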

Additionally, the test case in history_aware_retriever.test.ts confirms that the history (or chat_history) variable is expected to be injected into the prompt. The test shows how chat_history is passed and used within the prompt template:

const questionGenPrompt = ChatPromptTemplate.fromTemplate(
  `hi! {input} {chat_history}`
);

If the history variable is not being injected, it could be due to a misconfiguration or a bug in the implementation. The documentation also provides a detailed example of how to set up and use the RunnableWithMessageHistory class to ensure that message history is correctly managed and injected:

import { RunnableWithMessageHistory } from "@langchain/core/runnables";
import { UpstashRedisChatMessageHistory } from "@langchain/community/stores/message/upstash_redis";

const chainWithHistory = new RunnableWithMessageHistory({
  runnable: chain,
  getMessageHistory: (sessionId) =>
    new UpstashRedisChatMessageHistory({
      sessionId,
      config: {
        url: process.env.UPSTASH_REDIS_REST_URL!,
        token: process.env.UPSTASH_REDIS_REST_TOKEN!,
      },
    }),
  inputMessagesKey: "question",
  historyMessagesKey: "history",
});

This setup ensures that the history variable is properly injected into the prompt template, allowing the chain to use historical messages effectively [1][2][3].
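
For completeness, invoking the wrapped chain then looks roughly like this, where sessionId is any stable identifier you choose for the conversation:

const result = await chainWithHistory.invoke(
  { ability: "math", question: "What does cosine mean?" },
  { configurable: { sessionId: "example-session-1" } }
);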
