langchain-ai / langchainjs

🦜🔗 Build context-aware reasoning applications 🦜🔗
https://js.langchain.com/docs/
MIT License

How can we add Memory to the dynamictool #2477

Closed Boopalanoptisol closed 12 months ago

Boopalanoptisol commented 12 months ago

import { OpenAI } from "langchain/llms/openai";
import { initializeAgentExecutorWithOptions } from "langchain/agents";
import { DynamicTool } from "langchain/tools";

export const run = async () => {
  const model = new OpenAI({ temperature: 0 });
  const tools = [
    new DynamicTool({
      name: "FOO",
      description:
        "call this to get the value of foo. input should be an empty string.",
      func: async () => "baz",
    }),
    new DynamicTool({
      name: "BAR",
      description:
        "call this to get the value of bar. input should be an empty string.",
      func: async () => "baz1",
    }),
  ];

  const executor = await initializeAgentExecutorWithOptions(tools, model, {
    agentType: "zero-shot-react-description",
  });

  console.log("Loaded agent.");

  const input = `What is the value of foo?`;

  console.log(`Executing with input "${input}"...`);

  const result = await executor.call({ input });

  console.log(`Got output ${result.output}`);
};

In the above code, how can we add memory, so that if the user asks a question related to a previous question, the chatbot can answer it by analyzing the previous human messages?

dosubot[bot] commented 12 months ago

🤖

To add memory to your DynamicTool, you can use the BufferMemory class provided by the LangChain framework. This class is designed to store and manage previous chat messages, which is exactly what you need for your chatbot to analyze and respond to questions based on previous human messages.

Here's how you can modify your code to incorporate BufferMemory:

import { OpenAI } from "langchain/llms/openai";
import { initializeAgentExecutorWithOptions } from "langchain/agents";
import { DynamicTool } from "langchain/tools";
import { BufferMemory, ChatMessageHistory } from "langchain/memory";

export const run = async () => {
  const model = new OpenAI({ temperature: 0 });
  const tools = [
    new DynamicTool({
      name: "FOO",
      description:
        "call this to get the value of foo. input should be an empty string.",
      func: async () => "baz",
    }),
    new DynamicTool({
      name: "BAR",
      description:
        "call this to get the value of bar. input should be an empty string.",
      func: async () => "baz1",
    }),
  ];

  const memory = new BufferMemory({
    chatHistory: new ChatMessageHistory(),
    returnMessages: false,
    humanPrefix: "User",
    aiPrefix: "Bot",
    memoryKey: "conversationHistory",
  });

  const executor = await initializeAgentExecutorWithOptions(tools, model, {
    agentType: "zero-shot-react-description",
    memory: memory,
  });

  console.log("Loaded agent.");

  const input = `What is the value of foo?`;

  console.log(`Executing with input "${input}"...`);

  const result = await executor.call({ input });

  console.log(`Got output ${result.output}`);
};

In this code, a BufferMemory object is created and passed to the initializeAgentExecutorWithOptions function as part of the options object. This allows the executor to use the memory object to store and retrieve previous chat messages.

Please note that the BufferMemory class is a wrapper around ChatMessageHistory that extracts the messages into an input variable. It has a loadMemoryVariables method that loads the memory variables. If the returnMessages property is set to true, it returns the chat history as is. Otherwise, it returns the chat history as a string with the human and AI prefixes.
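The behavior described above can be illustrated with a minimal sketch. This is not LangChain's actual implementation, just a hypothetical model of how a buffer memory might return stored messages as-is or flatten them into a prefixed string depending on a `returnMessages` flag:

```typescript
// Illustrative sketch only (not LangChain's real BufferMemory): render stored
// messages either as raw objects or as a single "Prefix: text" string.
type StoredMessage = { role: "human" | "ai"; text: string };

function loadMemoryVariables(
  messages: StoredMessage[],
  returnMessages: boolean,
  humanPrefix = "Human",
  aiPrefix = "AI"
): StoredMessage[] | string {
  if (returnMessages) {
    // Chat-style prompts consume the message objects directly.
    return messages;
  }
  // Completion-style prompts get one flattened, prefixed transcript string.
  return messages
    .map((m) => `${m.role === "human" ? humanPrefix : aiPrefix}: ${m.text}`)
    .join("\n");
}
```

With `returnMessages: false` and the `humanPrefix`/`aiPrefix` options from the example above, two stored messages would come back as a string like `"User: hi\nBot: hello"`.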

You can find more information about the BufferMemory class in the LangChain codebase.

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

Boopalanoptisol commented 12 months ago

@dosu-beta When I add the above, I get this error:

No overload matches this call.
  Overload 1 of 2, '(tools: StructuredTool&lt;ZodObject&lt;any, any, any, any, { [x: string]: any; }&gt;&gt;[], llm: BaseLanguageModel, options: InitializeAgentExecutorOptionsStructured): Promise&lt;...&gt;', gave the following error.
    Type '"zero-shot-react-description"' is not assignable to type '"structured-chat-zero-shot-react-description" | "openai-functions"'.
  Overload 2 of 2, '(tools: Tool[], llm: BaseLanguageModel, options?: InitializeAgentExecutorOptions | undefined): Promise', gave the following error.
    Type 'BufferMemory' is not assignable to type 'undefined'. ts(2769)
initialize.d.ts(34, 5): The expected type comes from property 'agentType' which is declared here on type 'Init

dosubot[bot] commented 12 months ago

🤖

Hello,

The TypeScript error you're encountering is due to the fact that the "zero-shot-react-description" agent type does not accept a memory object in its initialization options. As per the InitializeAgentExecutorOptions interface, the memory field is marked as never for the "zero-shot-react-description" agent type, which means it should not be provided.
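The typing constraint behind that error can be sketched with a discriminated union. These are hypothetical types, not LangChain's actual declarations, but they show the mechanism: once the `agentType` discriminant is chosen, the compiler fixes which other fields are permitted, and `memory?: never` forbids passing a memory at all:

```typescript
// Hypothetical sketch of why the compiler rejects `memory` for some agent
// types: in a discriminated union, `memory?: never` on one branch means that
// choosing that agentType makes any memory value a compile-time error (TS2769).
type AgentOptionsSketch =
  | { agentType: "zero-shot-react-description"; memory?: never }
  | { agentType: "chat-conversational-react-description"; memory?: object };

function acceptsMemory(opts: AgentOptionsSketch): boolean {
  // At runtime only the discriminant is visible; the actual enforcement
  // happens at compile time, which is what the overload error reports.
  return opts.agentType === "chat-conversational-react-description";
}
```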

If you want to use a BufferMemory object, you should use an agent type that accepts a memory object in its initialization options. For example, the "chat-conversational-react-description" and "openai-functions" agent types do accept a memory object.

Here's an example of how you can initialize the agent executor with the "chat-conversational-react-description" agent type and a BufferMemory object:

const tools: Tool[] = []; // Your tools here
const llm = {} as BaseLanguageModel; // Your language model here
const memory = new BufferMemory({
  returnMessages: true,
  memoryKey: "chat_history",
  inputKey: "input",
  outputKey: "output",
});

const agentExecutor = await initializeAgentExecutorWithOptions(tools, llm, {
  agentType: "chat-conversational-react-description",
  memory,
});

And here's an example with the "openai-functions" agent type:

const tools: StructuredTool[] = []; // Your tools here
const llm = {} as BaseLanguageModel; // Your language model here
const memory = new BufferMemory({
  returnMessages: true,
  memoryKey: "chat_history",
  inputKey: "input",
  outputKey: "output",
});

const agentExecutor = await initializeAgentExecutorWithOptions(tools, llm, {
  agentType: "openai-functions",
  memory,
});

Please replace the tools and llm placeholders with your actual tools and language model.

You can find more information about the different agent types available in the LangChain framework and their specific use cases in the langchain/src/agents/toolkits/index.ts, langchain/src/agents/agent.ts, examples/src/agents/custom_agent.ts, and langchain/src/agents/agent_toolkits/index.ts files.

I hope this helps! Let me know if you have any other questions.


Boopalanoptisol commented 12 months ago

@dosu-bot

[chain/error] [1:chain:AgentExecutor] [1ms] Chain run errored with error: "Missing value for input variable chat_history" Trace: Error: Missing value for input variable chat_history

I'm getting this error.

dosubot[bot] commented 12 months ago

🤖

Hello,

The error message you're encountering, "Missing value for input variable chat_history", is likely due to the BufferMemory object expecting a value for the chat_history variable, but it's not being provided. In the code you've shared, the BufferMemory object is initialized with a memoryKey of "chat_history", but there's no code that actually sets a value for this key.
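To see where that error text comes from, here is an illustrative sketch (not LangChain's internal code) of prompt-variable substitution that fails the same way when a declared template variable is given no value:

```typescript
// Illustrative sketch: substitute {name} placeholders in a template string and
// throw when a referenced variable was never supplied, mimicking the
// "Missing value for input variable chat_history" failure mode.
function formatPrompt(
  template: string,
  values: Record<string, string>
): string {
  return template.replace(/\{(\w+)\}/g, (_match, key: string) => {
    if (!(key in values)) {
      throw new Error(`Missing value for input variable ${key}`);
    }
    return values[key];
  });
}
```

With a template containing `{chat_history}` and `{input}`, calling this with only `input` supplied throws, while supplying both keys succeeds.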

To resolve this issue, you need to ensure that the BufferMemory object has a value for the memoryKey ("chat_history") before you try to use it. You can do this by adding a message to the memory's chat history, for example via memory.chatHistory.addUserMessage. Here's how you can modify your code:

import { OpenAI } from "langchain/llms/openai";
import { BufferMemory } from "langchain/memory";
import { LLMChain } from "langchain/chains";
import { PromptTemplate } from "langchain/prompts";

const memory = new BufferMemory({ memoryKey: "chat_history" });
await memory.chatHistory.addUserMessage("Initial chat history message"); // Add this line

const model = new OpenAI({ temperature: 0.9 });
const prompt =
  PromptTemplate.fromTemplate(`The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:
{chat_history}
Human: {input}
AI:`);
const chain = new LLMChain({ llm: model, prompt, memory });

const res1 = await chain.call({ input: "Hi! I'm Jim." });
console.log({ res1 });

const res2 = await chain.call({ input: "What's my name?" });
console.log({ res2 });

In this modified code, I've added a line that seeds the BufferMemory object's chat history with an initial message. This ensures that there's a value for the "chat_history" key when you use the BufferMemory object in the LLMChain constructor.

In the context shared, it seems you are using the OpenAIAgentTokenBufferMemory class which has a default memoryKey of "history". If you are using this class instead of BufferMemory, you should ensure that the memoryKey in your PromptTemplate matches this default key, or you can set a custom memoryKey when initializing the OpenAIAgentTokenBufferMemory object.

The chat_history variable in the LangChain framework is used as a placeholder for the history of the conversation in the chat. It is used in the createPrompt method of the ChatConversationalAgent class, specifically in the "chat-conversational-react-description" agent type. The chat_history placeholder will be replaced with the actual history of the conversation when the prompt is rendered.

I hope this helps! Let me know if you have any other questions.


jacoblee93 commented 12 months ago

You'll need to use an agent specifically designed for conversation like the OpenAI functions agent:

https://js.langchain.com/docs/modules/agents/agent_types/openai_functions_agent