Closed snehankekre closed 10 months ago
Hey @snehankekre This would be a great addition!
Certainly! Thank you for the feedback 😃
With ac8567a, event loop creation and response generation are now centralized into two functions in `agent_utils.py`, which are called from both the home page and the generated agent page.
Is this what you had in mind?
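Roughly, the centralization looks like this (a minimal sketch — the helper names, signatures, and bodies below are illustrative stand-ins, not the actual contents of `agent_utils.py`):

```python
# Hypothetical sketch of the two centralized helpers described above.
import asyncio
from typing import Callable


def get_or_create_event_loop() -> asyncio.AbstractEventLoop:
    """Reuse the current thread's event loop, creating one if none exists.

    Streamlit script threads don't always have a running loop, so both
    pages go through this single helper instead of duplicating the logic.
    """
    try:
        return asyncio.get_running_loop()
    except RuntimeError:
        loop = asyncio.new_event_loop()
        asyncio.set_event_loop(loop)
        return loop


def generate_response(chat_engine: Callable[[str], str], prompt: str) -> str:
    """Single place both pages call to produce a response.

    In the real app this would invoke the chat engine's streaming API;
    here we call a plain callable to keep the sketch self-contained.
    """
    get_or_create_event_loop()  # ensure a loop exists before generating
    return chat_engine(prompt)
```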
I've resolved all conflicts. Looks ready to merge barring any blocking feedback @jerryjliu 🤞
Closing, since I was repeatedly unable to resolve merge conflicts with main.
🔍 Description
Closes #14.
LlamaIndex chat engines support streaming responses. This PR implements streaming responses to the Streamlit frontend, using `.stream_chat(prompt)` instead of `.chat(prompt)`.

cc @jerryjliu @carolinedlu
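The streaming pattern can be sketched as follows (a simplified stand-in: `FakeChatEngine` and `render_streaming_response` are hypothetical, and the commented-out `placeholder.markdown` line marks where the Streamlit UI update would happen in the real app):

```python
# Sketch of token-by-token streaming: the engine yields tokens as the LLM
# produces them, and the frontend accumulates and re-renders incrementally
# instead of waiting for the full .chat(prompt) response.
from typing import Iterator


class FakeChatEngine:
    """Stand-in for a LlamaIndex chat engine with a stream_chat method."""

    def stream_chat(self, prompt: str) -> Iterator[str]:
        # A real engine streams tokens from the LLM; we fake a fixed reply.
        for token in ["Hello", ", ", "world", "!"]:
            yield token


def render_streaming_response(engine: FakeChatEngine, prompt: str) -> str:
    """Accumulate tokens as they arrive, updating the UI on each one."""
    full_response = ""
    for token in engine.stream_chat(prompt):
        full_response += token
        # In the Streamlit app, a placeholder would be updated here, e.g.:
        # placeholder.markdown(full_response + "▌")
    return full_response


print(render_streaming_response(FakeChatEngine(), "hi"))  # prints "Hello, world!"
```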
✨ After
https://github.com/run-llama/rags/assets/20672874/df6be856-0ec4-4b08-9b3f-11d91a21bf02
😐 Before
https://github.com/run-llama/rags/assets/20672874/5edef67f-3949-4426-9edc-8a6ecfe3e86e