Closed SimonB97 closed 1 year ago
What happens is that you see the LLM response (first message) and the langchain agent final answer (second message). Usually the LLM response is supposed to be indented, but since you are passing the callback handler only to the LLM, there is nothing to indent the message in.
Remove the line callbacks=[cl.ChainlitCallbackHandler()] from the LLM definition and change langchain_run like this:
```python
@cl.langchain_run
async def run(agent, input):
    # Since the agent is sync, we need to make it async
    res = await cl.make_async(agent.run)(input, callbacks=[cl.ChainlitCallbackHandler()])
    await cl.Message(content=res).send()
```
When I do this, the response isn't streamed for me, but it did remove the double output of the response.
@willydouhard Sorry, not sure if you see notifications if I answer normally.
No worries. This is the correct behavior. By default, langchain does not stream the final answer, only the intermediary steps. It is a known issue, and we will try to address it in upcoming releases.
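For context, streaming works by invoking a callback once per generated token, and an agent's final answer is assembled after the intermediate steps finish, which is why only those steps stream. Here is a minimal, library-free sketch of that token-callback mechanism; the class and function names (TokenCollector, fake_llm_stream) are illustrative, not LangChain's actual API:

```python
class TokenCollector:
    """Toy stand-in for a streaming callback handler (illustrative only)."""

    def __init__(self):
        self.tokens = []

    def on_llm_new_token(self, token: str) -> None:
        # LangChain-style hook: called once per generated token.
        self.tokens.append(token)


def fake_llm_stream(prompt: str, handler: TokenCollector) -> str:
    # Pretend each word of the "answer" arrives as a separate token.
    answer = f"Echo: {prompt}"
    for token in answer.split(" "):
        handler.on_llm_new_token(token)
    return answer


handler = TokenCollector()
result = fake_llm_stream("hello world", handler)
print(result)          # → Echo: hello world
print(handler.tokens)  # → ['Echo:', 'hello', 'world']
```

A UI like Chainlit can render tokens as they arrive in the hook; if the final answer is never pushed through that hook, it only appears when the agent returns, which matches the behavior described above.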
Ahh, i see. Thank you!
Hi, your package is awesome, thank you for this!
Unfortunately, I'm having an issue where the responses are each displayed twice when using the ChainlitCallbackHandler.
I have to use the callback handler to get streaming output with an initially sync agent from langchain, which I made async by applying the fix (last function below) mentioned in your docs:
How can I stop the double output?
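For reference, the sync-to-async fix mentioned above wraps a blocking callable so it can be awaited without blocking the event loop. A rough, stdlib-only approximation of that pattern (this is a sketch, not Chainlit's actual cl.make_async implementation; slow_agent_run is a hypothetical stand-in for agent.run):

```python
import asyncio
import functools


def make_async_sketch(sync_fn):
    """Rough sketch of a sync-to-async wrapper (not Chainlit's cl.make_async)."""

    @functools.wraps(sync_fn)
    async def wrapper(*args, **kwargs):
        # Run the blocking function in a worker thread so the
        # event loop stays responsive.
        return await asyncio.to_thread(sync_fn, *args, **kwargs)

    return wrapper


def slow_agent_run(prompt: str) -> str:
    # Stand-in for a synchronous agent.run call.
    return prompt.upper()


async def main():
    result = await make_async_sketch(slow_agent_run)("hello")
    print(result)  # → HELLO


asyncio.run(main())
```

Note that asyncio.to_thread requires Python 3.9+; on older versions the same effect is achieved with loop.run_in_executor.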