I've followed the documentation for streaming activation on an Agent.
It seems streaming only works when the Agent has no tool_config. Once tool_config is set, every LLM request is made without streaming.
Current Behaviour
When tool_config is set, we never see the messages sent by the assistant, only the toolUse execution.
Sometimes the LLM asks for more detail before it can complete a task, and that message is never shown either.
Code snippet
File: agent/bedrock_llm_agent.py:L120
if self.tool_config:
    continue_with_tools = True
    final_message: ConversationMessage = {'role': ParticipantRole.USER.value, 'content': []}
    max_recursions = self.tool_config.get('toolMaxRecursions', self.default_max_recursions)

    while continue_with_tools and max_recursions > 0:
        bedrock_response = await self.handle_single_response(converse_cmd)
Possible Solution
I will push a PR.
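To illustrate the direction I have in mind, here is a minimal self-contained sketch. `SketchAgent`, `handle_streaming_response`, and `process_with_tools` are hypothetical stand-ins, not the actual multi-agent-orchestrator API; the point is only that the tool-use loop should branch on the streaming flag instead of always calling `handle_single_response`:

```python
import asyncio

class SketchAgent:
    """Hypothetical stand-in for the Bedrock agent; illustrates the branch only."""

    def __init__(self, streaming, tool_config):
        self.streaming = streaming
        self.tool_config = tool_config
        self.default_max_recursions = 5

    async def handle_single_response(self, converse_cmd):
        # Stand-in for the existing non-streaming Converse call.
        return {'role': 'assistant', 'content': [{'text': 'non-streamed'}]}

    async def handle_streaming_response(self, converse_cmd):
        # Hypothetical: in a real fix this would wrap the streaming API and
        # forward text chunks (including text around toolUse blocks) as they arrive.
        return {'role': 'assistant', 'content': [{'text': 'streamed'}]}

    async def process_with_tools(self, converse_cmd):
        max_recursions = self.tool_config.get('toolMaxRecursions',
                                              self.default_max_recursions)
        while max_recursions > 0:
            # Proposed change: honour self.streaming inside the tool loop
            # instead of unconditionally calling handle_single_response.
            if self.streaming:
                response = await self.handle_streaming_response(converse_cmd)
            else:
                response = await self.handle_single_response(converse_cmd)
            return response  # toolUse handling / recursion elided for brevity

agent = SketchAgent(streaming=True, tool_config={'toolMaxRecursions': 3})
result = asyncio.run(agent.process_with_tools(None))
print(result['content'][0]['text'])  # → streamed
```

With this shape, the assistant text that precedes a toolUse block would be streamed to the caller instead of being swallowed by the non-streaming path.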
Steps to Reproduce
Configure function calling (tool_config) on the agent together with streaming mode = True.
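Concretely, something like the following setup (treat the exact class and option names as assumptions drawn from the docs and the snippet above, not a verified API surface):

```python
# Sketch of the reproduction setup; names like BedrockLLMAgentOptions and
# my_tool_definition are assumptions / placeholders, not verified API.
from multi_agent_orchestrator.agents import BedrockLLMAgent, BedrockLLMAgentOptions

agent = BedrockLLMAgent(BedrockLLMAgentOptions(
    name="demo-agent",
    description="Agent with function calling and streaming enabled",
    streaming=True,                    # streaming works without tool_config...
    tool_config={                      # ...but adding this disables it
        "tool": my_tool_definition,    # hypothetical tool definition
        "toolMaxRecursions": 5,
    },
))
```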
Expected Behaviour
Streaming should keep working when tool_config is set: assistant messages (including text produced before and between toolUse executions) should be streamed to the caller.