Chainlit / chainlit

Build Conversational AI in minutes ⚡️
https://docs.chainlit.io
Apache License 2.0

Chainlit responses being cut off #623

Open CryptVenture opened 6 months ago

CryptVenture commented 6 months ago

I am attempting to create a chainlit-based chat application using Embedchain. Everything is working except the responses coming through chainlit are cut off / truncated and never seem to be displayed in their entirety. Is there a setting of some description that would make sure that all the streaming content is displayed?

willydouhard commented 6 months ago

What version of Chainlit are you using? Also, are you saying the response is not entirely visible (we have a feature to shrink long messages and you have a button to expand those responses) or the response is missing tokens entirely?

A screenshot would help!

antoniodagata77 commented 5 months ago

I have the same issue. If the answer provided by the LLM covers more than one line, the final answer shown by the UI includes only the first line. Chainlit version 1.0.101, farm-haystack version 1.23.0.

Complete answer in the shell:

Screenshot 2024-01-16 165425

Truncated answer in the UI:

Screenshot 2024-01-16 165548

willydouhard commented 5 months ago

Can you provide the code to reproduce?

antoniodagata77 commented 5 months ago

This is the code slice of interest:

```python
retriever = get_retriever()
agent = get_agent(retriever)
cl.HaystackAgentCallbackHandler(agent)

@cl.author_rename
def rename(orig_author: str):
    rename_dict = {"custom-at-query-time": "Agent Step"}
    return rename_dict.get(orig_author, orig_author)

@cl.on_message
async def main(message: cl.Message):
    response = await cl.make_async(agent.run)(message.content)
    await cl.Message(author="Agent", content=response["answers"][0].answer).send()
```

Note: If I set the parameter `stream_final_answer=True` in this way:

```python
cl.HaystackAgentCallbackHandler(agent, stream_final_answer=True)
```

the answer is complete, but the first line is repeated:

Screenshot 2024-01-17 120841

tpatel commented 4 months ago

If this is still something you are working on, have you tried using `stream_final_answer=True` and removing the line

```python
await cl.Message(author="Agent", content=response["answers"][0].answer).send()
```