Closed LarsAC closed 1 month ago
It looks like you're running into an issue where the `role` parameter is set to `'tool'`, but the OpenAI Assistants Messages API only accepts `'user'` or `'assistant'` as valid values. To resolve this, locate the part of your code where the API request is made and make sure the role is set to either `'user'` or `'assistant'`, depending on what you're trying to achieve.
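If the `'tool'` role is being injected by the framework rather than by your own code, one stopgap is to remap it just before the request goes out. A minimal sketch, assuming a plain list of message dicts; the `sanitize_roles` helper is hypothetical, not part of autogen or the openai library:

```python
# The Assistants Messages API only accepts 'user' and 'assistant' for
# thread messages, so anything else (e.g. 'tool') is remapped here.
ALLOWED_ROLES = {"user", "assistant"}

def sanitize_roles(messages):
    """Return a copy of `messages` with unsupported roles remapped to 'assistant'."""
    sanitized = []
    for msg in messages:
        msg = dict(msg)  # shallow copy so the caller's list is untouched
        if msg.get("role") not in ALLOWED_ROLES:
            msg["role"] = "assistant"
        sanitized.append(msg)
    return sanitized

msgs = [{"role": "tool", "content": "search results..."},
        {"role": "user", "content": "summarize"}]
print(sanitize_roles(msgs)[0]["role"])  # -> assistant
```

This only masks the symptom, of course; the real fix is finding where the framework assigns `'tool'` in the first place.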
@LarsAC @PersonaDev I've encountered the same issue and can reproduce it. After enabling debug output, I found that the `GPTAssistantAgent` is in fact generating a message with role value `'tool'` while using GroupChat. I'm not setting any messages explicitly; I'm simply calling `initiate_chat` for the group chat and then conversing. The tool my agent uses is the OpenAI Assistant file search, configured in my code as shown below. I have two `GPTAssistantAgent`s along with four `ConversableAgent`s. I've noticed this error doesn't always occur for the same `GPTAssistantAgent` in a given conversation, but it always occurs for one of them. Please let me know any thoughts on debugging; I'll dig into `conversable_agent.py` and `gpt_assistant_agent.py` in the meantime to try to find the role assignment issue. Based on the other related open issues, it seems this might be unrelated to `GPTAssistantAgent` and might instead be a general issue with tool-calling agents in GroupChat.
Thanks Evan
initiate groupchat:
```python
groupchat_result = user_proxy.initiate_chat(
    manager,
    message=task2,
)
```
I followed this guide for configuration: https://microsoft.github.io/autogen/docs/topics/openai-assistant/gpt_assistant_agent
my config:
```python
assistant_config = {
    "tools": [
        {"type": "file_search"},
    ],
    "tool_resources": {
        "file_search": {
            "vector_store_ids": ["$vector_store.id"]
        }
    }
}
```
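One thing worth double-checking: `"$vector_store.id"` above is a literal string. In the linked guide the ID comes from a vector store object created through the client, so unless that placeholder is substituted elsewhere in your code, the config would point at a non-existent store. A sketch of building the config from an actual ID (the `make_assistant_config` helper and the example ID are illustrative, not autogen API):

```python
def make_assistant_config(vector_store_id: str) -> dict:
    """Build a file_search assistant_config from a real vector store ID."""
    return {
        "tools": [{"type": "file_search"}],
        "tool_resources": {
            "file_search": {"vector_store_ids": [vector_store_id]},
        },
    }

# e.g. with an ID returned when you created the vector store:
cfg = make_assistant_config("vs_abc123")
print(cfg["tool_resources"]["file_search"]["vector_store_ids"])  # -> ['vs_abc123']
```

That said, a wrong vector store ID would normally surface as a different error than the role validation failure here, so this is a side note rather than the root cause.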
```python
content_manager = GPTAssistantAgent(
    name="content_manager",
    description="provide documents for strategy",
    llm_config={
        "config_list": llm_config_gpt_4o["config_list"],
        "assistant_id": "asst_wmOpxxxxxxxxxxxxxxx",
    },
    assistant_config=assistant_config,
)
```
ERROR logging:
```
DEBUG:openai._base_client:HTTP Response: POST https://api.openai.com/v1/threads/thread_EOMfr5UY2sOG7jrOAYn9CP27/messages "200 OK" Headers({'date': 'Sun, 18 Aug 2024 22:32:57 GMT', 'content-type': 'application/json', 'transfer-encoding': 'chunked', 'connection': 'keep-alive', 'openai-version': '2020-10-01', 'openai-organization': 'user-odczqxrlslggkjmvuya9yqaq', 'x-request-id': 'req_20f00ee86f9343181d7f642062e24f9d', 'openai-processing-ms': '134', 'strict-transport-security': 'max-age=15552000; includeSubDomains; preload', 'cf-cache-status': 'DYNAMIC', 'x-content-type-options': 'nosniff', 'server': 'cloudflare', 'cf-ray': '8b555cdfead72a9a-LAX', 'content-encoding': 'gzip', 'alt-svc': 'h3=":443"; ma=86400'})
DEBUG:openai._base_client:request_id: req_20f00ee86f934xxxxxxxx
DEBUG:openai._base_client:Request options: {'method': 'post', 'url': '/threads/thread_EOMfr5UY2sOG7jrOAYn9CP27/messages', 'headers': {'OpenAI-Beta': 'assistants=v2'}, 'files': None, 'json_data': {'content': '"Neuralink\'s ..... Overall, Neuralink is at the forefront of the BCI industry, driving innovation and pushing the boundaries of what is possible with neurotechnology."', 'role': 'tool'}}
DEBUG:openai._base_client:Sending HTTP Request: POST https://api.openai.com/v1/threads/thread_EOMfr5UY2sOG7jrOAYn9CP27/messages
```
Note the `'role': 'tool'` in the `json_data` of the outgoing request: this is the value the Messages API rejects.
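For anyone trying to reproduce this: the `DEBUG` lines above are emitted by the stock `openai` Python client through the standard `logging` module. One way to surface them, using only the standard library:

```python
import logging

# Route all loggers to stderr at DEBUG level. The openai client logs its
# request/response details under the "openai" logger hierarchy
# (e.g. "openai._base_client"), so they will show up in the output.
logging.basicConfig(level=logging.DEBUG)
logging.getLogger("openai").setLevel(logging.DEBUG)
```

I believe the client also honors the `OPENAI_LOG=debug` environment variable for the same effect, but the `logging` route above works regardless.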
Describe the bug
I have put together a small team of agents (user_proxy, two researchers, and a data analyst). The researchers are `AssistantAgent`s, the data analyst is a `GPTAssistantAgent` with the `code_interpreter` tool. Using a sequential chat mode (`user_proxy.initiate_chats()`) the conversation terminates fine. When I switch to using a GroupChat though, the chat aborts upon trying to talk to the data_analyst.
Steps to reproduce
No response
Model Used
Currently using gpt-4o, but does not seem model related.
Expected Behavior
Conversation should run smoothly without errors.
Screenshots and logs
No response
Additional Information
pyautogen==0.2.33
openai==1.37.1
Python 3.11.9
I went through issues #3164 and #960. While they seem somewhat related, I think this error has a different origin.