Open BabyCNM opened 1 month ago
NOTE: the test case is failing because of the API key. Let me know how to address it.
openai.AuthenticationError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: sk-mocko***************************************only. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
We can just skip the tests for this, if it has been tested locally on all OSes.
@BabyCNM I'm on Windows; let me know if it has already been tested.
@Hk669 Thanks! I have tested it on macOS.
@Hk669 Did you test it on Windows? Thank you!
I couldn't test it yet; let me do it tomorrow. It looks good to me, though.
The LMM tests should be skipped when skip-openai is specified. Please add a skip condition similar to the other contrib tests, and add the test back to the contrib-openai CI. @qingyun-wu I think the LMM tests were removed from the contrib-openai CI before. Could you chime in if there's an issue that prevents this test from being added to CI?
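For reference, a skip condition in the style of the other contrib tests could look roughly like this. This is a minimal sketch: the `skip_openai` flag name and the idea that it is populated from a `--skip-openai` option (e.g. in conftest.py) are assumptions, and the real wiring in the repo may differ.

```python
import pytest

# Hypothetical flag; in the real suite this would be set from the
# --skip-openai pytest option (e.g. in conftest.py), not hard-coded here.
skip_openai = True


@pytest.mark.skipif(skip_openai, reason="requested to skip openai tests")
def test_lmm_reply():
    # Test body that would hit the OpenAI API; pytest skips it entirely
    # when the flag is set, so no API key is needed in CI.
    raise AssertionError("should be skipped when skip_openai is True")
```

With this marker in place, running the suite with the skip option reports the LMM tests as skipped instead of failing with a 401 on the mock key.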
Ok. Will take a look at this and get back soon.
Why are these changes needed?
This PR contains only a one-line change (as the major change) and a new test case. However, it addresses several issues mentioned before: https://github.com/microsoft/autogen/issues/2550 https://github.com/microsoft/autogen/issues/3507
Details:
In 2024, we introduced the function `_generate_oai_reply_from_client` in ConversableAgent (https://github.com/microsoft/autogen/pull/1575), which can handle function calling, model dumping, reflection, and many other great features. However, this change was not applied to the MultimodalConversableAgent, which caused many issues afterwards for the multimodal agent in function calling, group chat, and many other places. The fix is simple.
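The shape of the fix can be sketched as follows. All names here are stand-ins for the real autogen classes (which differ in detail); the point is only that the multimodal agent delegates to the shared `_generate_oai_reply_from_client` helper instead of calling the client directly, so it inherits function calling, model dumping, reflection, and the rest for free.

```python
# Hypothetical, self-contained sketch of the fix; the real ConversableAgent
# and MultimodalConversableAgent in autogen are far richer than these stubs.

class ConversableAgent:
    def __init__(self):
        self._oai_system_message = [{"role": "system", "content": "You are a helpful agent."}]
        self.client = object()  # stand-in for the OpenAI client wrapper
        self.client_cache = None

    def _generate_oai_reply_from_client(self, client, messages, cache):
        # Stand-in for the real helper introduced in PR #1575, which handles
        # function calling, model dumping, reflection, etc.
        return "mock reply"


class MultimodalConversableAgent(ConversableAgent):
    def generate_oai_reply(self, messages, sender=None, config=None):
        client = self.client if config is None else config
        # The one-line change: delegate to the shared helper rather than
        # calling the completion API directly.
        extracted = self._generate_oai_reply_from_client(
            client, self._oai_system_message + messages, self.client_cache
        )
        return (False, None) if extracted is None else (True, extracted)


agent = MultimodalConversableAgent()
ok, reply = agent.generate_oai_reply([{"role": "user", "content": "hi"}])
```

Because the multimodal agent now goes through the same reply path as ConversableAgent, behaviors like group chat and tool use no longer need special-casing.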
Updates:

- MultimodalConversableAgent now replies via `_generate_oai_reply_from_client`.
- `gpt-4-turbo` is the default model instead of `gpt-4-vision-preview`, as the preview model is now deprecated by OpenAI.

Related issue number
Checks