ajndkr / lanarky

The web framework for building LLM microservices
https://lanarky.ajndkr.com/
MIT License
975 stars · 74 forks

ConversationalRetrievalChain #29

Closed: audvin closed this issue 1 year ago

audvin commented 1 year ago

Would this work together with ConversationalRetrievalChain?

E.g.

qa = ConversationalRetrievalChain(
    retriever=vectorstore.as_retriever(),
    combine_docs_chain=doc_chain,
    question_generator=question_generator,
    return_source_documents=False,
    get_chat_history=get_chat_history,
)

Ref: https://langchain-fanyi.readthedocs.io/en/latest/modules/chains/index_examples/chat_vector_db.html?highlight=ConversationalRetrievalChain#conversationalretrievalchain-with-streaming-to-stdout

ajndkr commented 1 year ago

Hi @audvin! We already have a PR open for this: #16.

It’s still in progress.

rogalvil commented 1 year ago

Apologies, I've been a bit busy, but I'll get back to it this weekend.


ajndkr commented 1 year ago

No rush!

audvin commented 1 year ago

That's awesome to hear! 🙏

ajndkr commented 1 year ago

@audvin Streaming response support for ConversationalRetrievalChain has been added in 0.4.4. It should be available via pip soon!

Let us know if it works!

tejeshbhalla commented 1 year ago

@ajndkr I understand how to call the streaming handler. One thing I'd like to know: if I'm not maintaining history on the frontend side, is there a way to get the whole response after it's done streaming, so I can commit it back to the DB as history?
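A minimal sketch of one common approach, assuming LangChain's callback hook `on_llm_new_token` (which is part of its callback handler interface): a small handler accumulates tokens as they stream, so the complete response is available afterwards to write back to the database. The `TokenCollector` class name and the usage below are hypothetical, not part of lanarky's API.

```python
class TokenCollector:
    """Collects streamed tokens so the full response can be persisted later.

    Mirrors the `on_llm_new_token` hook from LangChain's callback handlers;
    an instance can be registered alongside the streaming callback so it
    sees every token that is sent to the client.
    """

    def __init__(self) -> None:
        self._tokens: list[str] = []

    def on_llm_new_token(self, token: str, **kwargs) -> None:
        # Called once per streamed token.
        self._tokens.append(token)

    @property
    def full_response(self) -> str:
        # Join everything received so far into the complete answer.
        return "".join(self._tokens)


# Hypothetical usage: after streaming finishes, commit the full text as history.
collector = TokenCollector()
for token in ["Hello", ", ", "world", "!"]:  # stand-in for streamed tokens
    collector.on_llm_new_token(token)
print(collector.full_response)  # -> Hello, world!
```

After the stream completes, `collector.full_response` holds the same text the client received, ready to be saved as a chat-history row.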