Chainlit / chainlit

Build Conversational AI in minutes ⚡️
https://docs.chainlit.io
Apache License 2.0

Long Question Response Delay and Frontend Update Issue #699

Open saimurali52944 opened 8 months ago

saimurali52944 commented 8 months ago

I am encountering a notable issue while running a Chainlit app with GPT-4. When I ask longer questions that need more processing time, the response from GPT-4 is delayed, and even after the response arrives, the frontend at localhost:8000 does not show the updated answer. The delay in the frontend update becomes more apparent with more complex queries.

MahrRah commented 7 months ago

I am seeing the same issue. Has this been resolved somehow?

tpatel commented 7 months ago

Do you have a code example? I'd like to explore what is causing the delay.

MahrRah commented 6 months ago

@tpatel Here is a slightly artificial example where I see the issue triggered quite often: large_response_demo.py
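
Roughly, the demo is just an @cl.on_message handler that awaits a long GPT-4 completion and sends it back as a single message. A simplified sketch of that pattern (not the exact contents of large_response_demo.py; the OpenAI client call is only a placeholder for whatever the app actually uses):

```python
import chainlit as cl
from openai import AsyncOpenAI

client = AsyncOpenAI()  # assumes OPENAI_API_KEY is set in the environment


@cl.on_message
async def on_message(message: cl.Message):
    # Long prompts make this call take a while; the UI shows nothing
    # in the meantime.
    response = await client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": message.content}],
    )
    answer = response.choices[0].message.content

    # The answer shows up in the logs...
    print(answer)

    # ...but this message sometimes never renders in the frontend.
    await cl.Message(content=answer).send()
```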

The initial message I used was this:

Generate a basic boilerplate API for a bookstore using FastAPI. Consider the following endpoints: create, delete, get_all_books, update.

The books should be stored in an in-memory dictionary, with id, author, publishing year, and title. Make sure to cover basic error handling and logging using the logger library.

The response will be logged but not displayed in the UI.

Hope that helps.