microsoft / autogen

A programming framework for agentic AI 🤖
https://microsoft.github.io/autogen/

[Bug]: Claude support problem: not 'system' as an input message role #2246

Open 13331112522 opened 6 months ago

13331112522 commented 6 months ago

Describe the bug

pyautogen 0.2.21 on a MacBook Air: when using GroupChat, this error is raised:

openai.BadRequestError: Error code: 400 - {'error': {'message': 'messages: Unexpected role "system". The Messages API accepts a top-level system parameter, not "system" as an input message role. (request id: 2024040115252883577725946478791)', 'type': 'invalid_request_error', 'param': '', 'code': None}}
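
(For context, this is what the Anthropic Messages API itself expects. A minimal sketch using the official anthropic Python SDK directly, with placeholder model name and prompts: the system prompt goes in the top-level `system` parameter, and `messages` may only contain "user"/"assistant" roles.)

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

response = client.messages.create(
    model="claude-2.1",
    max_tokens=1024,
    system="You are a helpful assistant.",     # top-level system parameter
    messages=[
        {"role": "user", "content": "Hello"},  # no {"role": "system", ...} entries allowed here
    ],
)
print(response.content[0].text)
```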

Steps to reproduce

No response

Model Used

claude 2.1

Expected Behavior

No response

Screenshots and logs


No default IOStream has been set, defaulting to IOConsole.
Traceback (most recent call last):
  File "/Users/zhouql1978/dev/xplorer/xplorer-new.py", line 315, in <module>
    chat_results=user_proxy.initiate_chats(
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 1183, in initiate_chats
    self._finished_chats = initiate_chats(_chat_queue)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/autogen/agentchat/chat.py", line 179, in initiate_chats
    chat_res = sender.initiate_chat(chat_info)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 987, in initiate_chat
    self.send(msg2send, recipient, silent=silent)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 628, in send
    recipient.receive(message, self, request_reply, silent)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 788, in receive
    reply = self.generate_reply(messages=self.chat_messages[sender], sender=sender)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 1909, in generate_reply
    final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/autogen/agentchat/groupchat.py", line 618, in run_chat
    speaker = groupchat.select_speaker(speaker, self)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/autogen/agentchat/groupchat.py", line 434, in select_speaker
    final, name = selector.generate_oai_reply(messages)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 1275, in generate_oai_reply
    extracted_response = self._generate_oai_reply_from_client(
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 1294, in _generate_oai_reply_from_client
    response = llm_client.create(
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/autogen/oai/client.py", line 626, in create
    response = client.create(params)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/autogen/oai/client.py", line 279, in create
    response = completions.create(params)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/openai/_utils/_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 667, in create
    return self._post(
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/openai/_base_client.py", line 1208, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/openai/_base_client.py", line 897, in request
    return self._request(
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/openai/_base_client.py", line 988, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': 'messages: Unexpected role "system". The Messages API accepts a top-level system parameter, not "system" as an input message role. (request id: 2024040115252883577725946478791)', 'type': 'invalid_request_error', 'param': '', 'code': None}}

Additional Information

Python 3.10, autogen 0.2.21, MacBook Air M2

13331112522 commented 6 months ago

There is also a function calling problem with:

autogen.agentchat.register_function(
    save_db,
    caller=savor,
    executor=writer,
    description="save document into vectorDB",
)

It just doesn't work: savor never calls the function.
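
For reference, here is a hedged sketch of how this registration is usually wired up in autogen 0.2.x. The agent setup and llm_config below are assumptions rather than the original code, and the caller's model must actually support OpenAI-style tool/function calling for savor to emit the call:

```python
import autogen
from typing_extensions import Annotated

# Placeholder config; the caller's model must support tool/function calling.
config_list = [{"model": "gpt-4", "api_key": "sk-..."}]

# Type hints (plus Annotated descriptions) let autogen build the tool schema.
def save_db(document: Annotated[str, "Document text to store"]) -> str:
    # ... store `document` in the vector DB here ...
    return "saved"

savor = autogen.AssistantAgent(
    name="savor",
    llm_config={"config_list": config_list},
)
writer = autogen.UserProxyAgent(
    name="writer",
    human_input_mode="NEVER",
    code_execution_config=False,
)

autogen.agentchat.register_function(
    save_db,
    caller=savor,     # savor proposes the tool call via its LLM
    executor=writer,  # writer actually executes save_db
    description="save document into vectorDB",
)
```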

ekzhu commented 6 months ago

There are two separate issues here:

For the unexpected role issue, we have something coming up in the next release: https://microsoft.github.io/autogen/docs/tutorial/conversation-patterns#changing-the-select-speaker-role-name.
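
As a sketch of what that doc section describes (parameter name taken from the linked page, only available in newer 0.2.x releases; user_proxy, savor, writer, and llm_config stand in for your existing objects):

```python
import autogen

groupchat = autogen.GroupChat(
    agents=[user_proxy, savor, writer],
    messages=[],
    max_round=12,
    # Send the internal speaker-selection prompt with role "user" instead of
    # "system", which Claude-style Messages APIs reject.
    role_for_select_speaker_messages="user",
)
manager = autogen.GroupChatManager(groupchat=groupchat, llm_config=llm_config)
```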

Could you please also update your issue description with your code and your Claude setup? Are you using the Anthropic API or a proxy?

For your issue with the function call, it is hard to see the cause without your code.

levscaut commented 6 months ago

Hey, I've improved my Claude example to support the system message in a new PR. The Claude API without a system message is already available by following this notebook. Looking forward to your feedback!
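
In case it helps anyone landing here before the PR merges, here is a rough, simplified sketch of that custom-model-client pattern; the notebook is the authoritative version, the response wrapping below is illustrative, and the method names follow autogen's ModelClient protocol as I understand it:

```python
from types import SimpleNamespace

import autogen
from anthropic import Anthropic


class AnthropicClient:
    """Minimal custom model client that routes autogen calls to the Messages API."""

    def __init__(self, config, **kwargs):
        self._model = config["model"]
        self._client = Anthropic(api_key=config.get("api_key"))

    def create(self, params):
        # Pull any "system" messages out and pass them as the top-level parameter.
        messages = params["messages"]
        system = "\n".join(m["content"] for m in messages if m["role"] == "system")
        chat = [m for m in messages if m["role"] != "system"]
        extra = {"system": system} if system else {}
        completion = self._client.messages.create(
            model=self._model,
            max_tokens=params.get("max_tokens", 1024),
            messages=chat,
            **extra,
        )
        # Wrap the reply in an OpenAI-like response object that autogen can consume.
        message = SimpleNamespace(
            content=completion.content[0].text, tool_calls=None, function_call=None
        )
        return SimpleNamespace(choices=[SimpleNamespace(message=message)], model=self._model)

    def message_retrieval(self, response):
        return [choice.message.content for choice in response.choices]

    def cost(self, response):
        return 0.0  # cost tracking omitted in this sketch

    @staticmethod
    def get_usage(response):
        return {}  # usage tracking omitted in this sketch


config_list = [
    {"model": "claude-2.1", "api_key": "sk-ant-...", "model_client_cls": "AnthropicClient"}
]
assistant = autogen.AssistantAgent("assistant", llm_config={"config_list": config_list})
assistant.register_model_client(model_client_cls=AnthropicClient)
```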