CryptVenture opened 6 months ago
What version of Chainlit are you using? Also, are you saying the response is not entirely visible (we have a feature to shrink long messages and you have a button to expand those responses) or the response is missing tokens entirely?
A screenshot would help!
I have the same issue. If the answer provided by the LLM covers more than one line, the final answer shown in the UI includes only the first line. Chainlit version 1.0.101, farm-haystack version 1.23.0.
[Screenshot: complete answer in the shell]
[Screenshot: truncated answer in the UI]
Can you provide the code to reproduce?
This is the code slice of interest:

```python
retriever = get_retriever()
agent = get_agent(retriever)
cl.HaystackAgentCallbackHandler(agent)

@cl.author_rename
def rename(orig_author: str):
    rename_dict = {"custom-at-query-time": "Agent Step"}
    return rename_dict.get(orig_author, orig_author)

@cl.on_message
async def main(message: cl.Message):
    response = await cl.make_async(agent.run)(message.content)
    await cl.Message(author="Agent", content=response["answers"][0].answer).send()
```
Note: if I set the parameter `stream_final_answer=True` like this:

```python
cl.HaystackAgentCallbackHandler(agent, stream_final_answer=True)
```

the answer is complete, but the first line is repeated.
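As a stopgap while the repeated first line is being investigated, the duplicate could be stripped before displaying the answer. This is a minimal, hypothetical helper (not part of Chainlit or Haystack), and it assumes the duplication is an exact repeat of the first line immediately after itself:

```python
def drop_repeated_first_line(answer: str) -> str:
    """Work around a doubled first line in a streamed final answer.

    If the first line is immediately repeated, drop the duplicate;
    otherwise return the answer unchanged.
    """
    lines = answer.splitlines()
    if len(lines) >= 2 and lines[0] == lines[1]:
        return "\n".join(lines[1:])
    return answer

# Hypothetical doubled output, as described above:
doubled = "Paris is the capital.\nParis is the capital.\nIt is in France."
print(drop_repeated_first_line(doubled))  # first line appears only once
```

This only papers over the symptom; it does not address why the callback handler emits the first line twice.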
If this is still something you are working on, have you tried using `stream_final_answer=True` and removing this line?

```python
await cl.Message(author="Agent", content=response["answers"][0].answer).send()
```
I am attempting to create a Chainlit-based chat application using Embedchain. Everything is working except that the responses coming through Chainlit are cut off / truncated and never seem to be displayed in their entirety. Is there a setting that would ensure all of the streamed content is displayed?
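For what it's worth, the usual Chainlit streaming pattern is to create a single `cl.Message`, feed it tokens with `stream_token`, and finish with `send()` so the complete content is persisted. The flow can be sketched like this; `FakeMessage` is a stand-in for `cl.Message` so the sketch runs without Chainlit installed, and the token list is made up for illustration:

```python
import asyncio

class FakeMessage:
    """Stand-in for cl.Message: collects streamed tokens, then finalizes."""
    def __init__(self):
        self.content = ""
        self.sent = False

    async def stream_token(self, token: str):
        # cl.Message.stream_token appends the token to the displayed message
        self.content += token

    async def send(self):
        # the final send() persists the full message so nothing is cut off
        self.sent = True

async def relay(tokens):
    msg = FakeMessage()
    for tok in tokens:
        await msg.stream_token(tok)  # show partial output as it arrives
    await msg.send()                 # finalize with the complete content
    return msg

msg = asyncio.run(relay(["The ", "full ", "answer."]))
print(msg.content)
```

If the truncation happens only in the UI, it may also just be the collapse feature for long messages mentioned at the top of this thread, expandable with a button.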