awslabs / multi-agent-orchestrator

Flexible and powerful framework for managing multiple AI agents and handling complex conversations
https://awslabs.github.io/multi-agent-orchestrator/
Apache License 2.0

Agent with toolConfig does not work with Streaming #49

Closed hghandri closed 2 days ago

hghandri commented 3 days ago

Expected Behaviour

Hi,

I've followed the documentation for enabling streaming on an Agent. Streaming works when the agent has no tool_config, but once tool_config is set, all LLM requests are made without streaming.

Current Behaviour

When tool_config is used, we cannot see the message sent by the assistant, only the toolUse execution. Sometimes, when the LLM asks for more detail to complete a task, that message is never shown.

Code snippet

File: agent/bedrock_llm_agent.py:L120

if self.tool_config:

   continue_with_tools = True
   final_message: ConversationMessage = {'role': ParticipantRole.USER.value, 'content': []}
   max_recursions = self.tool_config.get('toolMaxRecursions', self.default_max_recursions)

   while continue_with_tools and max_recursions > 0:

      # Always calls the non-streaming handler, even when streaming is enabled
      bedrock_response = await self.handle_single_response(converse_cmd)
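The fix direction can be sketched as: inside the tool-use loop, branch on the streaming flag so assistant text is surfaced token by token while toolUse turns are still executed. This is an illustrative sketch, not the framework's actual code; `StubClient`, `converse`, `converse_stream`, and `run_tool_loop` are hypothetical stand-ins for the Bedrock runtime client and the agent loop.

```python
class StubClient:
    """Illustrative stub: returns one toolUse turn, then a final streamed text turn."""
    def __init__(self):
        self.turn = 0

    def converse(self, **cmd):
        # Non-streaming path: the whole reply arrives at once.
        self.turn += 1
        return {'content': [{'toolUse': {'name': 'get_weather'}}]}

    def converse_stream(self, **cmd):
        # Streaming path: events are yielded as they arrive.
        self.turn += 1
        if self.turn == 1:
            yield {'toolUse': {'name': 'get_weather'}}
        else:
            for token in ('It', ' is', ' sunny.'):
                yield {'text': token}

def run_tool_loop(client, cmd, streaming, max_recursions=5):
    streamed = []
    response = {'content': []}
    continue_with_tools = True
    while continue_with_tools and max_recursions > 0:
        if streaming:
            # Stream each turn: assistant text reaches the caller token by
            # token, which is what issue #49 reports as missing once
            # tool_config is set.
            response = {'content': []}
            for event in client.converse_stream(**cmd):
                streamed.append(event)
                response['content'].append(event)
        else:
            response = client.converse(**cmd)
        # Keep looping only while the model requests another tool call.
        continue_with_tools = any('toolUse' in block for block in response['content'])
        max_recursions -= 1
    return response, streamed

final, events = run_tool_loop(StubClient(), {}, streaming=True)
print(''.join(b['text'] for b in final['content'] if 'text' in b))  # prints "It is sunny."
```

The key design point is that the streaming branch lives *inside* the recursion loop, so intermediate assistant messages (e.g. clarification requests between tool calls) are streamed too, not just the final answer.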

Possible Solution

I will push a PR.

Steps to Reproduce

Configure function calling (tool_config) on an agent together with streaming mode = True.

brnaba-aws commented 2 days ago

fixed by #50