langchain-ai / langchainjs

🦜🔗 Build context-aware reasoning applications 🦜🔗
https://js.langchain.com/docs/
MIT License
12.32k stars 2.09k forks

Is there a way to set session-based memory with initializeAgentExecutorWithOptions? #4004

Closed KeenanFernandes2000 closed 8 months ago

KeenanFernandes2000 commented 8 months ago

I have a chatbot with a bunch of dynamic tools. However, when I publish it live the memory is shared across users, so I want to make the memory session-based. Is there a way to do that? I have no clue how.

const executor = await initializeAgentExecutorWithOptions(tools, model, {
  agentType: "chat-conversational-react-description",
  agentArgs: {
    prefix,
  },
});

I essentially want initializeAgentExecutorWithOptions to use session-based memory.

dosubot[bot] commented 8 months ago

🤖

Hello @KeenanFernandes2000! I'm Dosu, a helpful bot here to assist you while we wait for a human maintainer. I can help you solve bugs, answer questions, and guide you on becoming a contributor to our repository. Let's get started on your issue!

Yes, there is a way to configure session-based memory with the function initializeAgentExecutorWithOptions in the LangChainJS framework. This can be done by passing a memory object in the options parameter of the function.

The memory object can be an instance of BufferMemory, which stores the conversation history for a session. Passing memory this way is supported for the "chat-conversational-react-description" and "openai-functions" agent types.

Here is an example of how you can do it:

import { BufferMemory } from "langchain/memory";

const memory = new BufferMemory({
  returnMessages: true,
  memoryKey: "chat_history",
  inputKey: "input",
  outputKey: "output",
});

const executor = await initializeAgentExecutorWithOptions(tools, llm, {
  agentType: "chat-conversational-react-description",
  memory: memory,
});

In this example, BufferMemory is configured to return raw messages (returnMessages: true) and to use "chat_history", "input", and "output" as its memory, input, and output keys. This is the configuration used for session-based memory.

The BufferMemory object in the LangChainJS framework is a class that extends the BaseChatMemory class and implements the BufferMemoryInput interface. It is used for storing and managing previous chat messages.

The BufferMemory class has the following properties:

  1. humanPrefix: This is a string that represents the prefix for human messages in the chat history. The default value is "Human".

  2. aiPrefix: This is a string that represents the prefix for AI messages in the chat history. The default value is "AI".

  3. memoryKey: This is a string that represents the key for the memory variable where the chat history is stored. The default value is "history".

The BufferMemory class also has a constructor that takes an optional BufferMemoryInput object as a parameter. This object can have the following properties: chatHistory, returnMessages, inputKey, outputKey, humanPrefix, aiPrefix, and memoryKey. The constructor initializes the BufferMemory object with these values or their default values if they are not provided.

The BufferMemory class has a memoryKeys getter that returns an array containing the memoryKey.

The BufferMemory class also has a loadMemoryVariables method that takes an InputValues object as a parameter and returns a Promise that resolves with a MemoryVariables object. This method loads the memory variables from the chat history.
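To picture the interface described above, here is a minimal self-contained sketch (an illustration of the shape only, not the actual LangChainJS BufferMemory implementation):

```typescript
// Simplified sketch of the BufferMemory interface described above.
// Not the real LangChainJS class -- just an illustration of the shape.
type InputValues = Record<string, unknown>;
type MemoryVariables = Record<string, unknown>;

class SketchBufferMemory {
  memoryKey = "history";
  private messages: string[] = [];

  // The memoryKeys getter returns an array containing the memoryKey.
  get memoryKeys(): string[] {
    return [this.memoryKey];
  }

  addMessage(message: string): void {
    this.messages.push(message);
  }

  // loadMemoryVariables resolves with the stored chat history under memoryKey.
  async loadMemoryVariables(_values: InputValues): Promise<MemoryVariables> {
    return { [this.memoryKey]: this.messages.join("\n") };
  }
}
```

Calling `loadMemoryVariables({})` on this sketch after adding a message resolves to an object like `{ history: "Human: hi" }`, keyed by the memoryKey.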

You can find more details in the source code of the LangChainJS framework.

I hope this helps! If you have any other questions, feel free to ask.

Best, Dosu

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

KeenanFernandes2000 commented 8 months ago

@dosu-bot does the session-based memory ever clear? Also, how do I combine this solution with UpstashRedisChatMessageHistory?

dosubot[bot] commented 8 months ago

🤖

Yes, you can configure the initializeAgentExecutorWithOptions function in langchainjs to use session-based memory. The BufferMemory class is used to create a new memory instance, and the UpstashRedisChatMessageHistory class is used to store the chat history in a Redis database. The sessionId parameter is used to uniquely identify the conversation, and the sessionTTL parameter is used to set the time-to-live (TTL) for the session. If you omit the sessionTTL parameter, the session will never expire.

To ensure the session memory clears appropriately, you can set the sessionTTL parameter to the desired time in seconds. After this time, the session will automatically expire and the memory will be cleared.

Here is the code snippet that shows how to do this:

import { BufferMemory } from "langchain/memory";
import { UpstashRedisChatMessageHistory } from "@langchain/community/stores/message/upstash_redis";
import { ChatOpenAI } from "langchain/chat_models/openai";
import { ConversationChain } from "langchain/chains";

const memory = new BufferMemory({
  chatHistory: new UpstashRedisChatMessageHistory({
    sessionId: new Date().toISOString(), // Or some other unique identifier for the conversation
    sessionTTL: 300, // 5 minutes, omit this parameter to make sessions never expire
    config: {
      url: "https://ADD_YOURS_HERE.upstash.io", // Override with your own instance's URL
      token: "********", // Override with your own instance's token
    },
  }),
});

const model = new ChatOpenAI({
  modelName: "gpt-3.5-turbo",
  temperature: 0,
});

const chain = new ConversationChain({ llm: model, memory });

// ... rest of your code

In this code, the BufferMemory is configured with UpstashRedisChatMessageHistory to use session-based memory. The sessionTTL is set to 300 seconds (5 minutes), so the session memory will clear after 5 minutes.
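The TTL behaviour itself can be pictured with a small self-contained sketch (this illustrates the expiry semantics only; it is not Upstash's implementation, and the class name is made up):

```typescript
// Minimal sketch of session TTL semantics: a session's messages expire
// once `ttlSeconds` have elapsed since the session was last written.
class SketchSessionStore {
  private sessions = new Map<string, { messages: string[]; expiresAt: number }>();

  // `now` is injectable so the clock can be controlled in tests.
  constructor(private ttlSeconds: number, private now: () => number = () => Date.now()) {}

  append(sessionId: string, message: string): void {
    const entry = this.sessions.get(sessionId) ?? { messages: [], expiresAt: 0 };
    entry.messages.push(message);
    entry.expiresAt = this.now() + this.ttlSeconds * 1000; // refresh TTL on write
    this.sessions.set(sessionId, entry);
  }

  get(sessionId: string): string[] {
    const entry = this.sessions.get(sessionId);
    if (!entry || this.now() > entry.expiresAt) {
      this.sessions.delete(sessionId); // expired: the memory is cleared
      return [];
    }
    return entry.messages;
  }
}
```

With a 300-second TTL, a read within 5 minutes of the last write returns the history; a read after that returns nothing, mirroring how the Redis-backed session expires.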

You can find more information about this in the examples/src/memory/upstash_redis.ts file in the langchainjs repository.

The initializeAgentExecutorWithOptions function in the langchainjs framework is used to initialize an agent executor with options. It takes three parameters: an array of tools, a language model, and options for the agent. The options include the agent type, agent arguments, and other options for AgentExecutor.fromAgentAndTools.

Depending on the agent type specified in the options, the function creates an instance of the corresponding agent (ZeroShotAgent, ChatAgent, ChatConversationalAgent, XMLAgent, StructuredChatAgent, or OpenAIAgent) using the provided language model and tools. It then creates an instance of AgentExecutor using the created agent, the provided tools, and the remaining options.

You can find more information about this in the langchain/src/agents/initialize.ts file in the langchainjs repository.
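The dispatch on agent type can be pictured roughly like this (a hypothetical sketch of the pattern with stand-in names, not the actual initialize.ts source):

```typescript
// Hypothetical sketch: initializeAgentExecutorWithOptions picks an agent
// class based on the agentType string in the options.
type AgentKind =
  | "zero-shot-react-description"
  | "chat-conversational-react-description"
  | "openai-functions";

function agentClassFor(agentType: AgentKind): string {
  switch (agentType) {
    case "zero-shot-react-description":
      return "ZeroShotAgent";
    case "chat-conversational-react-description":
      return "ChatConversationalAgent";
    case "openai-functions":
      return "OpenAIAgent";
  }
}
```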

The BufferMemory class in the langchainjs framework is a type of memory component used for storing and managing previous chat messages. It is a wrapper around ChatMessageHistory that extracts the messages into an input variable. This class is particularly useful in applications like chatbots where it is essential to remember previous interactions.

You can find more information about this in the langchain/src/memory/buffer_memory.ts file in the langchainjs repository.

I hope this helps! If you have any other questions, feel free to ask.


KeenanFernandes2000 commented 8 months ago

@dosu-bot I have initializeAgentExecutorWithOptions and some dynamic tools that I have created. How do I tell the agent when to use a dynamic tool and when not to? And once a tool is used, is there a way to disable that dynamic tool?

dosubot[bot] commented 8 months ago

🤖

To configure the initializeAgentExecutorWithOptions function in the LangChainJS framework to use session-based memory, clear it when necessary, integrate it with UpstashRedisChatMessageHistory, and control the usage of dynamic tools, you can follow the steps below:

  1. Import the necessary modules and classes:

import { BufferMemory } from "langchain/memory";
import { UpstashRedisChatMessageHistory } from "@langchain/community/stores/message/upstash_redis";
import { ChatOpenAI } from "langchain/chat_models/openai";
import { ConversationChain } from "langchain/chains";

  2. Initialize the session-based memory with UpstashRedisChatMessageHistory:

const memory = new BufferMemory({
  chatHistory: new UpstashRedisChatMessageHistory({
    sessionId: new Date().toISOString(), // Or some other unique identifier for the conversation
    sessionTTL: 300, // 5 minutes, omit this parameter to make sessions never expire
    config: {
      url: "https://ADD_YOURS_HERE.upstash.io", // Override with your own instance's URL
      token: "********", // Override with your own instance's token
    },
  }),
});

  3. Initialize the language model:

const model = new ChatOpenAI({
  modelName: "gpt-3.5-turbo",
  temperature: 0,
});

  4. Initialize the ConversationChain with the language model and memory:

const chain = new ConversationChain({ llm: model, memory });

  5. Use the chain.call method to interact with the language model and store the conversation history in the session-based memory:

const res1 = await chain.call({ input: "Hi! I'm Jim." });
console.log({ res1 });

const res2 = await chain.call({ input: "What did I just say my name was?" });
console.log({ res2 });

  6. To clear the session-based memory when necessary, use the clear method of the BufferMemory class:

memory.clear();
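Tying the steps above together, one common pattern is to keep one memory buffer per session and clear it when that conversation ends. A self-contained sketch of that idea (a hypothetical registry, not a LangChainJS API):

```typescript
// Hypothetical per-session registry: one message buffer per sessionId,
// cleared explicitly when the conversation ends.
class SessionMemoryRegistry {
  private buffers = new Map<string, string[]>();

  // Return the buffer for a session, creating it on first use.
  forSession(sessionId: string): string[] {
    if (!this.buffers.has(sessionId)) this.buffers.set(sessionId, []);
    return this.buffers.get(sessionId)!;
  }

  // Mirrors calling memory.clear() for one session only.
  clear(sessionId: string): void {
    this.buffers.delete(sessionId);
  }
}
```

Each live user would get their own sessionId (for example from a cookie), so clearing one session leaves every other conversation intact.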

Please note that the initializeAgentExecutorWithOptions function is not mentioned in the provided context. Therefore, the configuration of this function is not covered in this answer. You can find more information about this function in the langchain/src/agents/initialize.ts file in the LangChainJS repository.

Regarding your question about dynamic tools, could you please provide more information about how you are using these tools and what you mean by "disabling" a tool? This will help me provide a more accurate answer.


KeenanFernandes2000 commented 8 months ago

@dosu-bot I have custom dynamic tools. How do I tell the agent when to use those tools and when not to?

dosubot[bot] commented 8 months ago

🤖

To set session-based memory with initializeAgentExecutorWithOptions, you can use the BufferMemory class from LangChainJS. This class is designed to manage session-based memory. Here's an example of how you can use it:

import { BufferMemory } from "langchain/memory";

const memory = new BufferMemory({
  returnMessages: true,
  memoryKey: "chat_history",
  inputKey: "input",
  outputKey: "output",
});

const executor = await initializeAgentExecutorWithOptions(tools, model, {
  agentType: "chat-conversational-react-description",
  agentArgs: {
    prefix,
  },
  memory: memory,
});

In this example, a new BufferMemory instance is created and passed as the memory option when calling initializeAgentExecutorWithOptions. The BufferMemory instance is configured to return messages and uses "chat_history", "input", and "output" as the keys for memory, input, and output respectively.

As for clearing the session-based memory, it depends on how you've implemented your BufferMemory instance. If you've set it up to clear after a certain condition (like after a certain amount of time or after a certain event), then it will clear. Otherwise, you'll need to manually clear it by calling the clear method on your BufferMemory instance.

To integrate UpstashRedisChatMessageHistory with initializeAgentExecutorWithOptions, you would need to pass the memory object, which includes the UpstashRedisChatMessageHistory instance, as an argument to the initializeAgentExecutorWithOptions function. However, the context provided does not include the initializeAgentExecutorWithOptions function, so I can't provide a specific code example.

Regarding your question about dynamic tools, you can control when to use a tool by implementing conditions in your tool's execute method. For example, you could check if a certain condition is met before executing the tool's logic. Here's an example:

class MyTool {
  async execute(input, context) {
    if (context.shouldUseMyTool) {
      // Tool logic here
    }
  }
}

In this example, the tool checks if context.shouldUseMyTool is true before executing its logic. You can set context.shouldUseMyTool to true or false depending on whether you want to use the tool or not.

Once a tool is used, you can disable it by setting a flag in your context. For example, you could set context.shouldUseMyTool to false after the tool is used. Here's an example:

class MyTool {
  async execute(input, context) {
    if (context.shouldUseMyTool) {
      // Tool logic here

      // Disable the tool
      context.shouldUseMyTool = false;
    }
  }
}

In this example, the tool sets context.shouldUseMyTool to false after it is used, effectively disabling the tool.
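A quick usage sketch of this one-shot pattern (hypothetical names, following the MyTool example above):

```typescript
// Demonstrates the run-once flag: the tool body executes on the first
// call and is skipped afterwards because the flag has been flipped.
class OneShotTool {
  async execute(
    input: string,
    context: { shouldUseMyTool: boolean }
  ): Promise<string | null> {
    if (!context.shouldUseMyTool) return null; // tool is disabled
    context.shouldUseMyTool = false; // disable after first use
    return `handled: ${input}`;
  }
}
```

The first call returns a result; every later call with the same context returns null, so the tool is effectively disabled for the rest of the session.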

I hope this helps! If you have any more questions, feel free to ask.
