Open saimurali52944 opened 8 months ago
I am seeing the same issue. Is this already resolved somehow?
Do you have a code example? I'd like to explore what is causing the delay.
@tpatel Here is a slightly artificial example where I see the issue triggered quite often: large_response_demo.py
The initial message I used was this:
Generate a basic boilerplate API for a bookstore using FastAPI. Consider the following endpoints: create, delete, get_all_books, update. The books should be stored in an in-memory dictionary, with id, author, publishing year, and title. Make sure to cover basic error handling and logging using the logger library.

The response will be logged but not displayed in the UI.
Hope that helps.
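For anyone who wants to reproduce this without the linked file, here is a rough sketch of the core logic that prompt asks the model to generate: plain in-memory CRUD with standard-library logging. This is my own illustration, not the actual model output or the contents of large_response_demo.py; the FastAPI routing layer is omitted and the function names are made up.

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("bookstore")

# In-memory store: book id -> book record
books: dict[int, dict] = {}
_next_id = 1


def create_book(author: str, year: int, title: str) -> dict:
    """Add a book and return the stored record."""
    global _next_id
    book = {"id": _next_id, "author": author, "year": year, "title": title}
    books[_next_id] = book
    logger.info("Created book %d: %s", _next_id, title)
    _next_id += 1
    return book


def get_all_books() -> list[dict]:
    """Return all stored books."""
    return list(books.values())


def update_book(book_id: int, **fields) -> dict:
    """Update fields of an existing book; raise KeyError if it is missing."""
    if book_id not in books:
        logger.error("Update failed: book %d not found", book_id)
        raise KeyError(book_id)
    books[book_id].update(fields)
    logger.info("Updated book %d", book_id)
    return books[book_id]


def delete_book(book_id: int) -> None:
    """Remove a book; raise KeyError if it is missing."""
    if book_id not in books:
        logger.error("Delete failed: book %d not found", book_id)
        raise KeyError(book_id)
    del books[book_id]
    logger.info("Deleted book %d", book_id)
```

A prompt like this tends to produce a fairly long completion (routes, models, error handlers), which is what makes it a good trigger for the delayed-UI-update behavior described in this thread.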
I am encountering a notable issue while running the Chainlit app with GPT-4. Specifically, when posing longer questions that require additional processing time, there is a delay in receiving a response from GPT-4. However, even after the response has been obtained, the frontend at localhost:8000 does not reflect the updated answer. The delay in the frontend update becomes more apparent with more complex queries.