Closed: @audvin closed this issue 1 year ago.
Hi @audvin! We already have an open PR for this: #16.
It's still in progress.
Apologies, I've been a bit busy, but I'll get back to it this weekend.
No rush!
That's awesome to hear! 🙏
@audvin streaming response support for ConversationalRetrievalChain has been added in 0.4.4. It should be available via pip soon!
Let us know if it works!
@ajndkr I get the gist of calling the streaming handler. One thing I want to know: if I'm not maintaining history on the frontend side, is there a way to get the whole response after it's done streaming, so I can commit it back to the DB as history?
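Not speaking for the library's own API, but one generic way to do this is to wrap the token stream in an accumulating async generator and invoke a completion callback with the joined text once streaming finishes (e.g. a DB write). A minimal sketch, with a fake token source standing in for the chain's streaming output; all names here are hypothetical:

```python
import asyncio

async def fake_token_stream():
    # Hypothetical stand-in for the chain's streamed tokens.
    for token in ["Hello", ", ", "world", "!"]:
        yield token

async def stream_and_collect(token_stream, on_complete):
    """Forward tokens to the caller as they arrive, while buffering
    them; when the stream ends, pass the full text to on_complete
    (e.g. a function that commits the response to the DB as history)."""
    buffer = []
    async for token in token_stream:
        buffer.append(token)
        yield token  # client still sees each token immediately
    on_complete("".join(buffer))  # full response, ready to persist

async def main():
    saved = {}       # pretend DB
    streamed = []    # what the client received, chunk by chunk
    async for token in stream_and_collect(
        fake_token_stream(),
        lambda text: saved.update(full=text),
    ):
        streamed.append(token)
    return streamed, saved

streamed, saved = asyncio.run(main())
print(saved["full"])  # the complete response, assembled server-side
```

The same wrapper idea should apply whatever produces the tokens: the client keeps its streaming behavior, and the server gets the complete string exactly once, after the stream closes.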
Would this work together with ConversationalRetrievalChain?
E.g. this example: https://langchain-fanyi.readthedocs.io/en/latest/modules/chains/index_examples/chat_vector_db.html?highlight=ConversationalRetrievalChain#conversationalretrievalchain-with-streaming-to-stdout