microsoft / autogen

A programming framework for agentic AI 🤖
https://microsoft.github.io/autogen/

[Bug]: Not understanding warning message from autogen during ConversableAgent creation. #3157

Open garyjkuehn opened 1 month ago

garyjkuehn commented 1 month ago

Describe the bug

The following warning is issued by autogen during ConversableAgent creation, even though the code functions as expected.

Sample output from a two-agent conversation that relies on a local function to return the weather, which is then used to determine tour suggestions.

[autogen.oai.client: 07-17 14:33:30] {164} WARNING - The API key specified is not a valid OpenAI format; it won't work with the OpenAI-hosted model.

user_proxy (to assistant2): What are some interesting things to do while in Washington DC today?

USING AUTO REPLY...

assistant2 (to user_proxy): Suggested tool call (call_GZJyzpI9Rf9KtNvEEL0slu16): get_weather
Arguments: {"location":"Washington DC"}

user_proxy (to assistant2): Response from calling tool (call_GZJyzpI9Rf9KtNvEEL0slu16)
Heavy precipitation is expected throughout the day in Washington DC. T-Storms are very possible.



USING AUTO REPLY... assistant2 (to user_proxy):

Here are five suggested indoor activities to enjoy in Washington DC due to the heavy precipitation and possible thunderstorms:

  1. Visit the Smithsonian museums.
  2. Explore the National Gallery of Art.
  3. Tour the U.S. Capitol Visitor Center.
  4. Discover the Library of Congress.
  5. Experience the International Spy Museum.

Feel free to let me know if you need more recommendations or information!
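
For readers trying to reproduce a similar setup, here is a minimal sketch of how a local weather tool might be wired into a two-agent chat with AutoGen 0.2. The get_weather body, the user_proxy configuration, and the termination check are illustrative assumptions, not the reporter's actual code:

```python
import os

from autogen import ConversableAgent, register_function

llm_config = {
    "config_list": [
        {"model": os.getenv("OPENAI_MODEL"), "api_key": os.getenv("OPENAI_API_KEY")}
    ]
}

# Agent that talks to the LLM and suggests tool calls.
assistant2 = ConversableAgent(
    name="assistant2",
    system_message="You are a helpful tour guide assistant. Return 'TERMINATE' when the task is done.",
    llm_config=llm_config,
)

# Agent that executes the suggested tool calls and relays the results.
user_proxy = ConversableAgent(
    name="user_proxy",
    llm_config=False,
    human_input_mode="NEVER",
    is_termination_msg=lambda msg: "TERMINATE" in (msg.get("content") or ""),
)

def get_weather(location: str) -> str:
    """Toy stand-in for the reporter's local weather function."""
    return f"Heavy precipitation is expected throughout the day in {location}."

# assistant2 proposes the get_weather call; user_proxy executes it.
register_function(
    get_weather,
    caller=assistant2,
    executor=user_proxy,
    description="Get today's weather for a location.",
)

user_proxy.initiate_chat(
    assistant2,
    message="What are some interesting things to do while in Washington DC today?",
)
```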

Steps to reproduce

The llm_config contains the model and api_key in this format:

```python
import os

model = os.getenv("OPENAI_MODEL")
api_key = os.getenv("OPENAI_API_KEY")

config_list = [
    {
        "model": model,
        "api_key": api_key,
    }
]
llm_config = {"config_list": config_list}
```

```python
from autogen import ConversableAgent

assistant2 = ConversableAgent(
    name="assistant2",
    system_message="""You are a helpful tour guide assistant that can recommend points of interest based upon ......
Return 'TERMINATE' when the task is done.""",
    llm_config=llm_config,
)
assistant2.print_usage_summary()
print(assistant2.llm_config)
```
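
Since the warning points at the key rather than the model, a quick sanity check before building config_list can rule out an unset or empty environment variable. This is a sketch, not part of the original report:

```python
import os

model = os.getenv("OPENAI_MODEL")
api_key = os.getenv("OPENAI_API_KEY")

# Fail fast with a clear message instead of passing None into the config.
if not model or not api_key:
    raise RuntimeError("Both OPENAI_MODEL and OPENAI_API_KEY must be set in the environment.")

config_list = [{"model": model, "api_key": api_key}]
llm_config = {"config_list": config_list}
```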

Model Used

gpt-3.5-turbo

Expected Behavior

The behavior is correct; I just can't figure out how to clear the warning.
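
If the goal is only to hide this particular message rather than change the key, one option is to raise the level of the logger named in the warning. This is a sketch and assumes the message is emitted through Python's standard logging module under the autogen.oai.client logger name shown in the log above:

```python
import logging

# Suppress WARNING-level messages (such as the API key format check) from
# autogen's OpenAI client logger; ERROR and above are still shown.
logging.getLogger("autogen.oai.client").setLevel(logging.ERROR)
```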

Screenshots and logs

No response

Additional Information

AutoGen Version: 0.2.32
Operating System: macOS Monterey 12.6.2
Python Version: 3.12.4

koorukuroo commented 1 month ago

Check your API_KEY. It probably starts with sk-None..., whereas it should carry a real value with a prefix like sk-proj..., so creating an API key that belongs to a project rather than using a legacy key should fix it.

Of course, this is a bug! :D
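
A quick way to confirm which kind of key the process actually sees, without printing the whole secret (a sketch; the sk-proj- prefix reflects OpenAI's current project key naming):

```python
import os

key = os.getenv("OPENAI_API_KEY") or ""
# A project key starts with "sk-proj-". A prefix like "sk-None" suggests the
# variable was built from a missing value and the real key never reached the env.
print("Loaded key prefix:", key[:8] or "<OPENAI_API_KEY not set>")
```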

garyjkuehn commented 1 month ago

I created a project key and that resolved this particular warning. Thank you.

marcocello commented 1 month ago

Hello, I am having the same issue. I have tried both "User API keys (Legacy)" and the new project API keys ("sk-proj-..."), and the warning appears either way.