Closed · bllchmbrs closed this issue 11 months ago
Hi @bllchmbrs! Totally agree on the working example point. The documentation is definitely lacking at the moment.
About the chat history issue: I initially intended the API endpoints to be stateless. To allow LangChain's built-in memory to work as expected, we'd need the endpoint to be stateful. Maybe a websocket is better suited for your use case. Have you tried using it?
If the framework doesn't support it, it's something I can look into.
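To make the trade-off concrete: one common pattern for keeping chat history on the server while the HTTP endpoint itself stays stateless is to key a memory store by a client-supplied session id. This is a framework-agnostic, stdlib-only sketch (the class and method names here are illustrative, not lanarky's or LangChain's actual API):

```python
# Illustrative sketch (hypothetical names, not lanarky's API): the endpoint
# handler looks up history by session id instead of expecting the client to
# resend the full history on every request.

from collections import defaultdict


class SessionMemoryStore:
    """Maps a session id to its accumulated chat history."""

    def __init__(self):
        self._histories = defaultdict(list)

    def add_turn(self, session_id: str, role: str, message: str) -> None:
        # Append one (role, message) turn to this session's history.
        self._histories[session_id].append((role, message))

    def get_history(self, session_id: str) -> list[tuple[str, str]]:
        # Return a copy so callers can't mutate the stored history.
        return list(self._histories[session_id])


store = SessionMemoryStore()
store.add_turn("abc", "human", "Hello!")
store.add_turn("abc", "ai", "Hi, how can I help?")
```

With a websocket, the same idea applies with less bookkeeping: the history can simply live in a variable scoped to the connection for as long as it stays open.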
Closing this issue due to user inactivity.
You can view the new documentation on `LangchainRouter` here: https://lanarky.ajndkr.com/learn/adapters/langchain/router/
If you are interested in learning more about the low-level modules, you can find their documentation here: https://lanarky.ajndkr.com/learn/adapters/langchain/fastapi/
Please reopen this issue if you'd like to discuss further.
I certainly could be doing something wrong, but there's so much "magic" in the `LangchainRouter` that I don't know what I should be doing. It works fine when I supply the chat history; I just don't want to manage that client-side, and I don't see the value in doing so when LangChain should have it integrated.
Scenario
Here is my LangchainRouter:
Here is my chat model definition:
When I make the API Call:
Actual result
Expected result
I would expect it to use the chain's built-in memory rather than requiring the API caller to supply the history on every request.
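The expected behavior can be sketched with a stdlib-only toy (hypothetical names, not LangChain's real classes): the chain owns its memory object, so repeated calls accumulate context without the caller resending prior turns.

```python
# Toy illustration of chain-owned memory (hypothetical names, not LangChain's
# actual API): each call sees the prior turns without the caller sending them.

class BufferMemory:
    """Accumulates (role, message) turns across calls."""

    def __init__(self):
        self.turns = []

    def save(self, user_input: str, output: str) -> None:
        self.turns.append(("human", user_input))
        self.turns.append(("ai", output))


class EchoChain:
    """Toy chain: the 'LLM' just reports how many prior turns it has seen."""

    def __init__(self, memory: BufferMemory):
        self.memory = memory

    def run(self, user_input: str) -> str:
        output = f"seen {len(self.memory.turns)} prior turns"
        self.memory.save(user_input, output)
        return output


chain = EchoChain(memory=BufferMemory())
first = chain.run("Hello")   # memory was empty on the first call
second = chain.run("Again")  # memory now holds the first exchange
```

In this sketch the second call automatically sees the first exchange, which is the behavior being requested of the endpoint.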
Acceptance criteria