langchain-ai / langserve

LangServe 🦜️🏓

May be difficult to distinguish each user's chat history when wrapping a conversational agent with LangServe #275

Open ChengyangDu opened 7 months ago

ChengyangDu commented 7 months ago

ConversationBufferMemory is useful in conversational agents, like the code below:

from langchain.agents import AgentType, initialize_agent
from langchain.memory import ConversationBufferMemory

# tools and llm are assumed to be defined elsewhere
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)

chain = initialize_agent(
    tools,
    llm,
    agent=AgentType.CHAT_CONVERSATIONAL_REACT_DESCRIPTION,
    verbose=True,
    memory=memory,
)

However, when integrated with LangServe, it is difficult to distinguish the context of each user. Is there any good practice for this problem when using LangServe?

eyurtsev commented 7 months ago

Hi @ChengyangDu, I still need to add an example of how to do it -- haven't managed to get around to it yet; targeting next week (other users have asked about this as well).

What you're looking for is this: https://api.python.langchain.com/en/latest/runnables/langchain_core.runnables.history.RunnableWithMessageHistory.html#langchain_core.runnables.history.RunnableWithMessageHistory

It'll need to be combined with a per-request modifier in add_routes that helps resolve the raw request object to a session id corresponding to the given user. You can check how it's done in OpenGPTs for inspiration:

https://github.com/langchain-ai/opengpts

(Or you can wait a few days -- I'll try to post a more self-contained example.)
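
A minimal sketch of how these pieces could fit together, assuming a FastAPI app, a chain or agent named chain defined elsewhere, and that the client identifies itself via an x-user-id header (the header name, the in-memory store, and the path are illustrative only, not the final example):

from fastapi import FastAPI, Request
from langchain_community.chat_message_histories import ChatMessageHistory
from langchain_core.runnables.history import RunnableWithMessageHistory
from langserve import add_routes

# In-memory store of per-session histories; a real deployment would use Redis or similar.
histories = {}

def get_history(session_id: str) -> ChatMessageHistory:
    if session_id not in histories:
        histories[session_id] = ChatMessageHistory()
    return histories[session_id]

chain_with_history = RunnableWithMessageHistory(
    chain,  # the conversational runnable / agent defined elsewhere
    get_history,
    input_messages_key="input",
    history_messages_key="chat_history",
)

def per_req_config_modifier(config: dict, request: Request) -> dict:
    # Resolve the raw request to a session id tied to the calling user.
    # Here we trust an x-user-id header purely for illustration; a real app
    # would derive this from authentication.
    user_id = request.headers.get("x-user-id", "anonymous")
    config.setdefault("configurable", {})
    config["configurable"]["session_id"] = user_id
    return config

app = FastAPI()
add_routes(
    app,
    chain_with_history,
    path="/agent",
    per_req_config_modifier=per_req_config_modifier,
)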

ChengyangDu commented 7 months ago

Sure, I'll wait and see. In a production-ready setup, I believe two features need to be supported for conversational memory:

  1. a session to distinguish users.
  2. custom memory backed by some storage middleware (rather than local memory), e.g. Redis (a minimal sketch follows this list).
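
For the second point, a rough sketch of what a storage-backed history factory might look like, assuming a Redis instance at a local URL (the URL is illustrative):

from langchain_community.chat_message_histories import RedisChatMessageHistory

def get_history(session_id: str) -> RedisChatMessageHistory:
    # Each session id maps to its own Redis-backed message history,
    # so conversations survive process restarts and can be shared across workers.
    return RedisChatMessageHistory(session_id, url="redis://localhost:6379/0")
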
ChengyangDu commented 7 months ago

Hi! Based on the info you provided, I've tried to distinguish histories by session by adopting RunnableWithMessageHistory & RedisChatMessageHistory.

However, this approach does not seem to support recording intermediate_steps, which conflicts with 'hwchase17/react-chat'. In the end, the code below fails:

from langchain import hub
from langchain.agents.format_scratchpad import format_log_to_str
from langchain.agents.output_parsers import ReActSingleInputOutputParser
from langchain.tools.render import render_text_description
from langchain_community.chat_message_histories import RedisChatMessageHistory
from langchain_core.runnables.history import RunnableWithMessageHistory

prompt = hub.pull("hwchase17/react-chat")
prompt = prompt.partial(
    tools=render_text_description(tools),
    tool_names=", ".join([t.name for t in tools]),
)

llm_with_stop = llm.bind(stop=["\nObservation"])

agent = (
    {
        "input": lambda x: x["input"],
        "agent_scratchpad": lambda x: format_log_to_str(x["intermediate_steps"]),
        "chat_history": lambda x: x["chat_history"],
    }
    | prompt
    | llm_with_stop
    | ReActSingleInputOutputParser()
)
chain_with_history = RunnableWithMessageHistory(
    agent,
    lambda session_id: RedisChatMessageHistory(session_id, url="redis://localhost:6379/0"),
    input_messages_key="input",
    history_messages_key="chat_history",
)

Stack trace:

  File "assistant_chain_distributed.py", line 180, in <lambda>
    "agent_scratchpad": lambda x: format_log_to_str(x["intermediate_steps"]),
KeyError: 'intermediate_steps'

So, any suggestions for that? Thanks a lot!
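
For reference, one pattern that avoids this KeyError is to wrap an AgentExecutor, which populates intermediate_steps internally during its tool loop, with RunnableWithMessageHistory, rather than wrapping the bare agent. A rough sketch, assuming the same agent, tools, and Redis URL as above, and an illustrative session id (untested against this exact setup):

from langchain.agents import AgentExecutor
from langchain_community.chat_message_histories import RedisChatMessageHistory
from langchain_core.runnables.history import RunnableWithMessageHistory

# The executor runs the tool loop itself, so callers only supply "input"
# and "chat_history"; "intermediate_steps" never has to come from outside.
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

chain_with_history = RunnableWithMessageHistory(
    agent_executor,
    lambda session_id: RedisChatMessageHistory(session_id, url="redis://localhost:6379/0"),
    input_messages_key="input",
    history_messages_key="chat_history",
)

result = chain_with_history.invoke(
    {"input": "hello"},
    config={"configurable": {"session_id": "user-123"}},  # example session id
)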

harrison001 commented 6 months ago

Merry Christmas. I had the same problem. Any update on this?

harrison001 commented 6 months ago

MemGPT: The Future of LLMs with Unlimited Memory (https://github.com/cpacker/MemGPT). I strongly recommend looking into this one.

ZohaibRamzan commented 1 month ago

Looking for a solution! Has anyone been able to manage session-id-based chat history or pipeline state (LangGraph)?