microsoft / autogen

A programming framework for agentic AI 🤖
https://microsoft.github.io/autogen/

[Bug]: Using reviewer agent (AssistantAgent) with claude-3-5-sonnet-20240620 crashes #3162

Open gilko1981 opened 4 months ago

gilko1981 commented 4 months ago

Describe the bug

The agent fails only when it is configured with llm=claude-3-5-sonnet-20240620. With other configs, such as gpt-4o, the agent functions as expected. Assigning the claude-3-5-sonnet-20240620 model to a different agent also works as expected.

Using the reviewer agent (AssistantAgent) with claude-3-5-sonnet-20240620 crashes. I have a simple round-robin group chat - [analyzer, analyzer_assistant, executor, reviewer]. When the reviewer agent is configured with claude-3-5-sonnet-20240620, the execution crashes with:

raise self._make_status_error_from_response(err.response) from None

anthropic.BadRequestError: Error code: 400 - {'type': 'error', 'error': {'type': 'invalid_request_error', 'message': 'messages: Unexpected role "function". Allowed roles are "user" or "assistant". For instructions on how to use tools, see https://docs.anthropic.com/en/docs/tool-use.'}}
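
For context: the Anthropic Messages API only accepts the roles "user" and "assistant" in the messages array, so any OpenAI-style "function" message that reaches it unchanged is rejected with this 400. A minimal sketch that reproduces the same error outside of AutoGen, calling the anthropic SDK directly (the content strings are placeholders):

    import anthropic

    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

    # A message with role "function" is passed through as-is; the API rejects it
    # with the same invalid_request_error shown above.
    client.messages.create(
        model="claude-3-5-sonnet-20240620",
        max_tokens=1024,
        messages=[
            {"role": "user", "content": "Please continue."},
            {"role": "function", "content": "some function result"},  # not an allowed role
        ],
    )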

Steps to reproduce

  1. Assign the claude-3-5-sonnet-20240620 model to the Analyzer agent (only this model, only to this agent).
  2. Start the round-robin group chat - [analyzer (simple LLM call), analyzer_assistant (prepares the function call), executor (executes the function call), reviewer (simple LLM call)].

Model Used

claude-3-5-sonnet-20240620

Expected Behavior

Normal group chat execution

Screenshots and logs

    Traceback (most recent call last):
      File "/Users/gil/Project personal/autogen_new/blogs/create_spider_config.py", line 62, in calc_config
        blog_articles_patterns_results = retrieve_xpath_dict(input_htmls=[input_main_html],
      File "/Users/gil/Project personal/autogen_new/resolve_xpath_set.py", line 51, in retrieve_xpath_dict
        manager.initiate_chat(
      File "/Users/gil/Project personal/autogen_new/.venv/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 1018, in initiate_chat
        self.send(msg2send, recipient, silent=silent)
      File "/Users/gil/Project personal/autogen_new/.venv/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 655, in send
        recipient.receive(message, self, request_reply, silent)
      File "/Users/gil/Project personal/autogen_new/.venv/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 820, in receive
        self.send(reply, sender, silent=silent)
      File "/Users/gil/Project personal/autogen_new/.venv/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 655, in send
        recipient.receive(message, self, request_reply, silent)
      File "/Users/gil/Project personal/autogen_new/.venv/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 818, in receive
        reply = self.generate_reply(messages=self.chat_messages[sender], sender=sender)
      File "/Users/gil/Project personal/autogen_new/.venv/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 1972, in generate_reply
        final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
      File "/Users/gil/Project personal/autogen_new/.venv/lib/python3.12/site-packages/autogen/agentchat/groupchat.py", line 1052, in run_chat
        reply = speaker.generate_reply(sender=self)
      File "/Users/gil/Project personal/autogen_new/.venv/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 1972, in generate_reply
        final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
      File "/Users/gil/Project personal/autogen_new/.venv/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 1340, in generate_oai_reply
        extracted_response = self._generate_oai_reply_from_client(
      File "/Users/gil/Project personal/autogen_new/.venv/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 1359, in _generate_oai_reply_from_client
        response = llm_client.create(
      File "/Users/gil/Project personal/autogen_new/.venv/lib/python3.12/site-packages/autogen/oai/client.py", line 722, in create
        response = client.create(params)
      File "/Users/gil/Project personal/autogen_new/.venv/lib/python3.12/site-packages/autogen/oai/anthropic.py", line 141, in create
        response = self._client.messages.create(**anthropic_params)
      File "/Users/gil/Project personal/autogen_new/.venv/lib/python3.12/site-packages/anthropic/_utils/_utils.py", line 277, in wrapper
        return func(*args, **kwargs)
      File "/Users/gil/Project personal/autogen_new/.venv/lib/python3.12/site-packages/anthropic/resources/messages.py", line 902, in create
        return self._post(
      File "/Users/gil/Project personal/autogen_new/.venv/lib/python3.12/site-packages/anthropic/_base_client.py", line 1266, in post
        return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
      File "/Users/gil/Project personal/autogen_new/.venv/lib/python3.12/site-packages/anthropic/_base_client.py", line 942, in request
        return self._request(
      File "/Users/gil/Project personal/autogen_new/.venv/lib/python3.12/site-packages/anthropic/_base_client.py", line 1046, in _request
        raise self._make_status_error_from_response(err.response) from None
    anthropic.BadRequestError: Error code: 400 - {'type': 'error', 'error': {'type': 'invalid_request_error', 'message': 'messages: Unexpected role "function". Allowed roles are "user" or "assistant". For instructions on how to use tools, see https://docs.anthropic.com/en/docs/tool-use.'}}

Additional Information

No response

gilko1981 commented 4 months ago

Seems like this is the problem:

    102-DEBUG-Request options: {
        'method': 'post',
        'url': '/v1/messages',
        'timeout': 600,
        'files': None,
        'json_data': {
            'max_tokens': 4096,
            'messages': [
                {'content': 'Please continue.', 'role': 'user'},
                {'content': "\n inputs:\n {'articles': 'this is the title'}\n\n exatracted_items:\n [{'articles': []}]\n ", 'role': 'function'},
                {'content': 'Please continue.', 'role': 'user'}
            ],
            'model': 'claude-3-5-sonnet-20240620',
            'stream': False,
            'system': 'some prompt'
        }
    }
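
So the payload that autogen/oai/anthropic.py builds still carries a message with 'role': 'function'. A rough sketch of the kind of remapping that would need to happen before the messages reach the API; the function and its behaviour are illustrative only, not the actual client code:

    def remap_roles_for_anthropic(messages):
        """Illustrative only: fold OpenAI-style 'function'/'tool' messages into
        'user' messages so only roles Anthropic accepts ('user', 'assistant') remain."""
        remapped = []
        for msg in messages:
            role = msg.get("role")
            if role in ("function", "tool"):
                # Surface the function/tool output as a user-visible result instead.
                remapped.append({"role": "user", "content": f"Tool result:\n{msg.get('content', '')}"})
            else:
                remapped.append({"role": role, "content": msg.get("content", "")})
        return remapped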

Hk669 commented 4 months ago

@gilko1981 can you give us the setup you used to reproduce this bug?

gilko1981 commented 4 months ago

@Hk669

    import re

    import autogen

    # Agents (system messages elided here, shown as "...")
    analyzer = autogen.AssistantAgent(
        name=analyzer_agent_name,
        human_input_mode="NEVER",
        default_auto_reply="...",
        system_message=f"""...""",
        llm_config=claude_3_5_sonnet_config,
    )

    analyzer_assistant = autogen.AssistantAgent(
        name=analyzer_assistant_agent_name,
        human_input_mode="NEVER",
        system_message=f"""...""",
        llm_config={**gpt4o_config, "functions": functions},
    )

    executor = autogen.UserProxyAgent(
        name="Executor",
        system_message="",
        human_input_mode="NEVER",
        code_execution_config={
            "last_n_messages": 1,
            "work_dir": ".",
            "use_docker": False,
        },
        function_map={function_definition.get("name"): function},
    )

    reviewer = autogen.AssistantAgent(
        name=reviewer_agent_name,
        human_input_mode="NEVER",
        default_auto_reply="...",
        system_message=f"""...""",
        llm_config=claude_3_5_sonnet_config,
        is_termination_msg=lambda x: re.sub(
            r"[^a-zA-Z0-9]", "", x.get("content", "").rstrip()
        ).endswith("TERMINATE"),
    )

    # Round-robin group chat over the four agents
    participating_agents = [analyzer, analyzer_assistant, executor, reviewer]
    groupchat = autogen.GroupChat(
        agents=participating_agents,
        messages=[],
        max_round=max_rounds,
        func_call_filter=False,
        speaker_selection_method="round_robin",
    )
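
For completeness, a few names referenced above (claude_3_5_sonnet_config, gpt4o_config, functions, the *_agent_name variables, and the manager that drives the chat) are not shown. A minimal sketch of what the two LLM configs and the kickoff might look like, assuming the Anthropic client is selected via api_type="anthropic" and API keys come from the environment; the recipient and message passed to initiate_chat are placeholders (the traceback only shows manager.initiate_chat(...) being called from the author's own code):

    import os

    import autogen

    # Assumed LLM configs; api_type "anthropic" routes requests to autogen/oai/anthropic.py
    claude_3_5_sonnet_config = {
        "config_list": [
            {
                "model": "claude-3-5-sonnet-20240620",
                "api_key": os.environ["ANTHROPIC_API_KEY"],
                "api_type": "anthropic",
            }
        ]
    }
    gpt4o_config = {
        "config_list": [
            {"model": "gpt-4o", "api_key": os.environ["OPENAI_API_KEY"]}
        ]
    }

    # Drive the round-robin chat; the actual kickoff message is not in the report
    manager = autogen.GroupChatManager(groupchat=groupchat, llm_config=gpt4o_config)
    executor.initiate_chat(manager, message="...")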