sharrajesh opened this issue 1 month ago
To add more context:
This gist shows working streaming and non-streaming code for my LangChain agent with multiple tools returning a dict when using `create_openai_tools_agent`: https://gist.github.com/sharrajesh/1080af5a95ae9d7b83a8da46597b68e1
This gist shows the non-working streaming code for my LangChain agent with multiple tools returning a dict when using `create_react_agent`: https://gist.github.com/sharrajesh/765c0b6edfe991363675f45d467e3c93
Checked other resources
Description
When using `astream_events` with an agent that has multiple tools, if one of the tools returns a dictionary containing an answer and source documents, the streaming process fails. However, the same setup works correctly in non-streaming mode.

Steps to Reproduce
Set up an agent with multiple tools where one tool returns a dictionary, then call `astream_events` to stream the agent's output.

Code to Reproduce
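The original reproduction code is in the gists above. As a rough illustration of the setup being described, here is a minimal sketch (not the gist code): the `lookup_answer` tool, its return values, the `hwchase17/react` hub prompt, and the Bedrock model id are all assumptions standing in for the real agent and tools.

```python
import asyncio

from langchain import hub
from langchain.agents import AgentExecutor, create_react_agent
from langchain_aws import ChatBedrock
from langchain_core.tools import tool


@tool
def lookup_answer(question: str) -> dict:
    """Return an answer together with its source documents."""
    # Returning a plain dict is what triggers the failure in streaming mode.
    return {"answer": "42", "source_documents": ["doc-1", "doc-2"]}


llm = ChatBedrock(model_id="anthropic.claude-3-sonnet-20240229-v1:0")
prompt = hub.pull("hwchase17/react")
agent = create_react_agent(llm, [lookup_answer], prompt)
executor = AgentExecutor(agent=agent, tools=[lookup_answer])


async def main() -> None:
    # Non-streaming invocation handles the dict tool output.
    print(await executor.ainvoke({"input": "What is the answer?"}))

    # Streaming the same agent fails once the tool returns a dict.
    async for event in executor.astream_events(
        {"input": "What is the answer?"}, version="v1"
    ):
        print(event["event"])


asyncio.run(main())
```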
Error Message
Expected Behavior
The `astream_events` method should handle tool outputs that return dictionaries, just as it does in non-streaming mode.

Actual Behavior
The streaming process fails with a ValidationError when a tool returns a dictionary.
Workaround
Returning a JSON string instead of a dictionary from the tool allows the streaming to work:
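A minimal sketch of that workaround, reusing the hypothetical `lookup_answer` tool from the sketch above (field names are placeholders, not the original gist code):

```python
import json

from langchain_core.tools import tool


@tool
def lookup_answer(question: str) -> str:
    """Return the answer and its source documents as a JSON string."""
    result = {"answer": "42", "source_documents": ["doc-1", "doc-2"]}
    # Serializing the dict to a JSON string before returning it lets
    # astream_events stream the agent's output without the ValidationError.
    return json.dumps(result)
```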
Environment
Model: Claude 3 Sonnet (via AWS Bedrock)
Operating System: (Please specify your OS)
Additional Context
This issue only occurs in streaming mode when using
astream_events
. The same code works correctly in non-streaming mode. It appears that theastream_events
method is not properly handling dictionary outputs from tools, possibly due to an issue in the event conversion process.The problem is reproducible with AWS Bedrock using the Claude 3 Sonnet model, but it may also affect other LLM providers and models.