langflow-ai / langflow

⛓️ Langflow is a visual framework for building multi-agent and RAG applications. It's open-source, Python-powered, fully customizable, LLM and vector store agnostic.
http://www.langflow.org
MIT License

[Feature] CombinedMemory #848

Closed bonelli closed 6 months ago

bonelli commented 9 months ago

Hey there, is there any way to use LangChain's CombinedMemory with Langflow?

Here are the docs: https://python.langchain.com/docs/modules/memory/multiple_memory and here is the API reference: https://api.python.langchain.com/en/latest/memory/langchain.memory.combined.CombinedMemory.html

thanks
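
For reference, the composition described in those docs looks roughly like this outside of Langflow (a minimal sketch; the memory keys are the ones from the docs example, not anything Langflow-specific):

from langchain.llms import OpenAI
from langchain.memory import (
    CombinedMemory,
    ConversationBufferMemory,
    ConversationSummaryMemory,
)

# Raw recent turns are exposed under "chat_history_lines", the running summary under "history".
conv_memory = ConversationBufferMemory(
    memory_key="chat_history_lines", input_key="input"
)
summary_memory = ConversationSummaryMemory(llm=OpenAI(), input_key="input")

# CombinedMemory fans every load/save call out to the sub-memories it wraps.
memory = CombinedMemory(memories=[conv_memory, summary_memory])
print(memory.memory_variables)  # ['chat_history_lines', 'history']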

YamonBot commented 9 months ago

I have configured a custom component as shown in the JSON file below.

Since each memory (ConversationBufferWindowMemory, ConversationSummaryMemory) unconditionally performs its own storage, I had to wire up two separate DBs, which I did not want. I would like to see this feature implemented first.

[screenshot attached]


---mysrc---

from typing import Optional

from langflow import CustomComponent
from langchain.memory import (
    ConversationBufferMemory,
    CombinedMemory,
    ConversationSummaryMemory,
)
from langchain.schema.memory import BaseMemory


class ConversationChainForMe(CustomComponent):
    display_name: str = "MultiMemory"
    description: str = "formymemory"

    def build_config(self) -> dict:
        return {
            "verbose": {"type": "bool", "value": False, "display_name": "Verbose"},
        }

    def build(
        self,
        verbose: bool,
        conv_memory: Optional[BaseMemory] = None,
        summary_memory: Optional[BaseMemory] = None,
    ) -> BaseMemory:
        # CombinedMemory simply wraps the two sub-memories passed in as inputs.
        return CombinedMemory(memories=[conv_memory, summary_memory])

---myprompt---

The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Summary of conversation:
{history}
Current conversation:
{chat_history_lines}
Human: {input}
AI:
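
For context, the placeholders in this prompt need to match the memory keys of the sub-memories fed into the component above: {history} comes from the summary memory and {chat_history_lines} from the buffer memory. A hedged sketch of the wiring, following the LangChain docs example rather than anything Langflow-specific:

from langchain.chains import ConversationChain
from langchain.llms import OpenAI
from langchain.memory import (
    CombinedMemory,
    ConversationBufferMemory,
    ConversationSummaryMemory,
)
from langchain.prompts import PromptTemplate

llm = OpenAI()

# Equivalent to what the component's build() returns for these two inputs.
memory = CombinedMemory(
    memories=[
        ConversationBufferMemory(memory_key="chat_history_lines", input_key="input"),
        ConversationSummaryMemory(llm=llm, input_key="input"),  # memory_key defaults to "history"
    ]
)

_TEMPLATE = """The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Summary of conversation:
{history}
Current conversation:
{chat_history_lines}
Human: {input}
AI:"""

prompt = PromptTemplate(
    input_variables=["history", "chat_history_lines", "input"], template=_TEMPLATE
)

chain = ConversationChain(llm=llm, prompt=prompt, memory=memory, verbose=True)
chain.run("Hi there!")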

YamonBot commented 8 months ago

I am also conducting my own research. When a flow is called through the API endpoint, in Langflow's flow-run stage (which I am not familiar with), a memory key is unconditionally injected into the flow, and a memory_key is expected on the CombinedMemory when it is executed. Since this is a key feature for agent implementations, I hope the Langflow developers will consider this part important.
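
To make the mismatch concrete: unlike the single-key memory classes, CombinedMemory exposes no memory_key attribute at all, only the aggregated memory_variables, so code that reads .memory_key unconditionally fails on it. A small sketch (plain LangChain, no Langflow):

from langchain.memory import CombinedMemory, ConversationBufferMemory

buffer = ConversationBufferMemory(memory_key="chat_history_lines", input_key="input")
combined = CombinedMemory(memories=[buffer])

print(buffer.memory_key)                # 'chat_history_lines'
print(combined.memory_variables)        # ['chat_history_lines']
print(hasattr(combined, "memory_key"))  # False -- this is what the flow-run path trips over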

ogabrielluiz commented 8 months ago

Hey @yamonkjd Thanks for the input. Could you be more specific: what use case do you plan on solving, and what is Langflow doing differently from what you want to achieve?

YamonBot commented 8 months ago

> Hey @yamonkjd Thanks for the input. Could you be more specific: what use case do you plan on solving, and what is Langflow doing differently from what you want to achieve?

I am attaching my flow. If you save it as JSON, it works, and the MongoDB connection information uses a free account, so you can actually try it, which I think is convenient. Yamon Agent and Memories are custom components that I composed myself. I recommend this form for its versatility across cycles, and you can use any tool with it.

I declared the CombinedMemory inside the agent and took the list of base memories as an input, returned from a component called Memories. It works in the web UI, with some bugs, but when input is sent to the API endpoint, an error is returned saying that memory_key is not a property of CombinedMemory, so the two paths seem to operate differently. Also, in this configuration it fails with an error unless you create multiple MongoDB databases, and the agent's response is stored as many times as there are memories.
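
For anyone trying to reproduce this: the Memories component mentioned above is essentially just a custom component returning a list of sub-memories, each backed by its own MongoDB collection so their writes do not clash. A hedged sketch of that idea (the component name, field names, and collection names are illustrative, not the exact code from my flow):

from typing import List

from langflow import CustomComponent
from langchain.llms.base import BaseLanguageModel
from langchain.memory import (
    ConversationBufferWindowMemory,
    ConversationSummaryMemory,
    MongoDBChatMessageHistory,
)
from langchain.schema.memory import BaseMemory


class Memories(CustomComponent):
    display_name: str = "Memories"
    description: str = "Returns the list of sub-memories for CombinedMemory"

    def build(
        self,
        llm: BaseLanguageModel,
        connection_string: str,
        session_id: str,
    ) -> List[BaseMemory]:
        # Each sub-memory gets its own collection; sharing one backing store is
        # what leads to the duplicated writes described above.
        recent = ConversationBufferWindowMemory(
            k=5,
            memory_key="chat_history_lines",
            input_key="input",
            chat_memory=MongoDBChatMessageHistory(
                connection_string=connection_string,
                session_id=session_id,
                collection_name="recent_messages",
            ),
        )
        summary = ConversationSummaryMemory(
            llm=llm,
            memory_key="history",
            input_key="input",
            chat_memory=MongoDBChatMessageHistory(
                connection_string=connection_string,
                session_id=session_id,
                collection_name="summary_messages",
            ),
        )
        return [recent, summary]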

YamonBot commented 8 months ago

@ogabrielluiz

I created a conversation flow by providing intermediate content, knowledge graph information, and a list of recent conversations as input. Since this kind of memory component type expands the versatility of LangChain, I would like you to consider this configuration.

YamonBot commented 8 months ago

[screenshot of the concept attached]

Here is my concept

YamonBot commented 8 months ago

from typing import List, Optional

from langflow import CustomComponent
from langchain.prompts import SystemMessagePromptTemplate
from langchain.prompts.chat import MessagesPlaceholder
from langchain.tools import Tool
from langchain.schema.memory import BaseMemory
from langchain.llms.base import BaseLanguageModel
from langchain.agents.agent import AgentExecutor
from langchain.agents.openai_functions_agent.base import OpenAIFunctionsAgent
from langchain.agents.openai_functions_multi_agent.base import OpenAIMultiFunctionsAgent
from langchain.memory import CombinedMemory
from langchain.agents.agent_toolkits.conversational_retrieval.openai_functions import (
    _get_default_system_message,
)


class YamonAgent(CustomComponent):
    display_name: str = "Yamon Agent"
    description: str = "Conversational Agent that can use OpenAI's function calling API"

    def build_config(self):
        return {
            "llm": {"display_name": "LLM"},
            "memories_component": {"display_name": "Memory Container"},
            "tools": {"is_list": True, "display_name": "Multi Tools"},
            "system_message": {"display_name": "SystemMessagePromptTemplate"},
        }

    def build(
        self,
        llm: BaseLanguageModel,
        tools: List[Tool],
        memories_component: Optional[List[BaseMemory]] = None,  # type updated
        system_message: Optional[SystemMessagePromptTemplate] = None,
    ) -> AgentExecutor:
        _system_message = system_message or _get_default_system_message()  # type: ignore

        # Collect every memory variable exposed by the sub-memories so each one
        # gets its own placeholder in the prompt.
        memory_keys: List[str] = []
        if memories_component:
            for memory in memories_component:
                memory_keys.extend(memory.memory_variables)

        extra_prompt_messages = [
            MessagesPlaceholder(variable_name=var) for var in memory_keys
        ]

        prompt = OpenAIMultiFunctionsAgent.create_prompt(
            system_message=_system_message,  # type: ignore
            extra_prompt_messages=extra_prompt_messages,
        )

        # Convert the Optional list into a CombinedMemory (empty list if none given).
        memory = CombinedMemory(memories=memories_component or [])

        agent = OpenAIFunctionsAgent(llm=llm, tools=tools, prompt=prompt)

        return AgentExecutor(
            agent=agent,
            tools=tools,
            memory=memory,
            verbose=True,
            return_intermediate_steps=True,
        )

YamonBot commented 8 months ago

As a result of my research, when a flow is accessed through an API endpoint the tweaks should be applied first, but it seems the agent executor is run immediately, before the tweaks have been applied.
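
For what it's worth, running the flow from Python applies the tweaks at load time, before anything executes, which sidesteps that ordering problem. A hedged sketch, assuming the load_flow_from_json helper and tweak format from the Langflow README of this period (the file name and node id below are placeholders):

from langflow import load_flow_from_json

# Tweaks are keyed by node id; the values override that node's fields before
# the flow (and therefore the agent executor) is built.
TWEAKS = {
    "YamonAgent-XXXX": {"verbose": True},
}

flow = load_flow_from_json("yamon_agent_flow.json", tweaks=TWEAKS)
flow("Hello, what do you remember about me?")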

YamonBot commented 8 months ago

Finally, I implemented combined memory. There is still a problem with sub-memories storing messages redundantly, but as a draft, progress has been made to make it work.
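
The redundancy comes from CombinedMemory.save_context delegating to every sub-memory, so each one persists the same exchange to its own backing store. A small in-memory sketch of that behaviour (no Langflow or MongoDB involved):

from langchain.memory import (
    CombinedMemory,
    ConversationBufferMemory,
    ConversationBufferWindowMemory,
)

full = ConversationBufferMemory(memory_key="full_history", input_key="input")
recent = ConversationBufferWindowMemory(k=2, memory_key="recent_history", input_key="input")
combined = CombinedMemory(memories=[full, recent])

# One call, but both sub-memories store the exchange; with a shared DB-backed
# chat history this shows up as duplicated rows.
combined.save_context({"input": "hello"}, {"output": "hi there"})
print(len(full.chat_memory.messages))    # 2 (human + ai)
print(len(recent.chat_memory.messages))  # 2 (human + ai)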

YamonBot commented 8 months ago

Modify your

langflow/src/backend/langflow/interface/run.py

by changing get_memory_key to the following:

def get_memory_key(langchain_object):
    """
    Given a LangChain object, this function retrieves the current memory key
    from the object's memory attribute. It then checks if the key exists in a
    dictionary of known memory keys and returns the corresponding key, or None
    if the current key is not recognized.
    """
    mem_key_dict = {
        "chat_history": "history",
        "history": "chat_history",
    }

    # Check if memory_key attribute exists
    if hasattr(langchain_object.memory, "memory_key"):
        memory_key = langchain_object.memory.memory_key
        return mem_key_dict.get(memory_key)
    else:
        return None  # or some other default value or action
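
With that change applied, a quick sanity check of the patched helper could look like this (DummyChain is just a stand-in for whatever LangChain object Langflow hands to run.py):

from langchain.memory import CombinedMemory, ConversationBufferMemory


class DummyChain:
    """Stand-in for the langchain_object that run.py receives."""

    def __init__(self, memory):
        self.memory = memory


single = DummyChain(ConversationBufferMemory(memory_key="chat_history"))
combined = DummyChain(
    CombinedMemory(
        memories=[ConversationBufferMemory(memory_key="chat_history_lines", input_key="input")]
    )
)

print(get_memory_key(single))    # 'history' -- mapped through mem_key_dict
print(get_memory_key(combined))  # None -- CombinedMemory has no memory_key attribute
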
stale[bot] commented 7 months ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.