microsoft / autogen

A programming framework for agentic AI. Discord: https://aka.ms/autogen-dc. Roadmap: https://aka.ms/autogen-roadmap
https://microsoft.github.io/autogen/
Creative Commons Attribution 4.0 International

[Bug]: gemini model not working #2740

Closed: asiletto closed this issue 1 month ago

asiletto commented 1 month ago

Describe the bug

I am running this cookbook https://github.com/microsoft/autogen/blob/main/website/docs/topics/non-openai-models/cloud-gemini.ipynb

The only thing I changed is the Gemini API key, using one I generated myself.

I get this error:

Logging session ID: 6b97b3b5-27b7-4f5a-b96d-6e16478fb7f6
Traceback (most recent call last):
  File "C:\Users\XXX\workspace\agents\ag-joke.py", line 37, in <module>
    gemini = autogen.AssistantAgent(
             ^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\XXX\Anaconda3\envs\autogen\Lib\site-packages\autogen\agentchat\assistant_agent.py", line 62, in __init__
    super().__init__(
  File "C:\Users\XXX\Anaconda3\envs\autogen\Lib\site-packages\autogen\agentchat\conversable_agent.py", line 154, in __init__
    self._validate_llm_config(llm_config)
  File "C:\Users\XXX\Anaconda3\envs\autogen\Lib\site-packages\autogen\agentchat\conversable_agent.py", line 258, in _validate_llm_config
    self.client = None if self.llm_config is False else OpenAIWrapper(**self.llm_config)
                                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\XXX\Anaconda3\envs\autogen\Lib\site-packages\autogen\oai\client.py", line 383, in __init__
    self._register_default_client(config, openai_config)  # could modify the config
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\XXX\Anaconda3\envs\autogen\Lib\site-packages\autogen\oai\client.py", line 444, in _register_default_client
    log_new_client(client, self, openai_config)
                   ^^^^^^
UnboundLocalError: cannot access local variable 'client' where it is not associated with a value
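
For context, the traceback points at a classic Python failure pattern: a local variable (client) that is only assigned inside certain api_type-specific branches but is then passed to the logging call unconditionally. A minimal, self-contained sketch of that pattern (illustrative names only, not the actual autogen source):

def log_new_client(client, config):
    # Stand-in for autogen's runtime logging hook
    print(f"registered {client} for model {config['model']}")

def register_default_client(config):
    api_type = config.get("api_type")
    if api_type == "azure":
        client = "AzureOpenAI-client"   # 'client' is bound here...
    elif api_type is None:
        client = "OpenAI-client"        # ...or here
    # No branch binds 'client' for api_type == "google", yet logging still runs:
    log_new_client(client, config)

register_default_client({"model": "gemini-pro", "api_type": "google"})
# UnboundLocalError: cannot access local variable 'client' where it is not associated with a value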

Steps to reproduce

import os
import ctypes

# Enable ANSI escape sequence (virtual terminal) processing in the Windows console
kernel32 = ctypes.windll.kernel32
kernel32.SetConsoleMode(kernel32.GetStdHandle(-11), 7)

import autogen

logging_session_id = autogen.runtime_logging.start(config={"dbname": "logs2.db"})
print("Logging session ID: " + str(logging_session_id))

config_list = autogen.config_list_from_json(
    env_or_file="autogen.json",
    filter_dict={
        "model": ["gpt-35-turbo-16k"],
    },
)

config_list_gemini = autogen.config_list_from_json(
    env_or_file="autogen.json",
    filter_dict={
        "model": ["gemini-1.0-pro-002"],
    },
)

seed = 25

gpt = autogen.AssistantAgent(
    "GPT-3",
    system_message="""You should ask weird, tricky, and concise questions.
Ask the next question based on (by evolving) the previous one.""",
    llm_config={"config_list": config_list, "seed": seed},
    max_consecutive_auto_reply=3,
)

gemini = autogen.AssistantAgent(
    "Gemini-Pro",
    system_message="""Always answer questions within one sentence. """,
    #                      system_message="answer:",
    llm_config={"config_list": config_list_gemini, "seed": seed},
    max_consecutive_auto_reply=4,
)

gpt.initiate_chat(gemini, message="Do Transformers purchase auto insurance or health insurance?")

autogen.runtime_logging.stop()

Model Used

I tried

{
    "model": "gemini-1.0-pro-002",
    "api_key": ".....",
    "api_type": "google"
}

and

{
    "model": "gemini-pro",
    "api_key": ".....",
    "api_type": "google"
}
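
For reference, config_list_from_json in the script above loads these entries from the file autogen.json, which is expected to contain a JSON list of such config dicts. With both models it would look roughly like this (API keys elided):

[
    {
        "model": "gpt-35-turbo-16k",
        "api_key": "....."
    },
    {
        "model": "gemini-1.0-pro-002",
        "api_key": ".....",
        "api_type": "google"
    }
]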

Expected Behavior

No response

Screenshots and logs

No response

Additional Information

New Anaconda env with Python 3.12 and these libs:

annotated-types==0.7.0
anyio==4.3.0
cachetools==5.3.3
certifi==2024.2.2
charset-normalizer==3.3.2
colorama==0.4.6
diskcache==5.6.3
distro==1.9.0
docker==7.0.0
FLAML==2.1.2
google-ai-generativelanguage==0.6.4
google-api-core==2.19.0
google-api-python-client==2.129.0
google-auth==2.29.0
google-auth-httplib2==0.2.0
google-generativeai==0.5.4
googleapis-common-protos==1.63.0
grpcio==1.64.0
grpcio-status==1.62.2
h11==0.14.0
httpcore==1.0.5
httplib2==0.22.0
httpx==0.27.0
idna==3.7
numpy==1.26.4
openai==1.30.1
packaging==24.0
pillow==10.3.0
proto-plus==1.23.0
protobuf==4.25.3
pyasn1==0.6.0
pyasn1_modules==0.4.0
pyautogen==0.2.27
pydantic==2.7.1
pydantic_core==2.18.2
pyparsing==3.1.2
python-dotenv==1.0.1
pywin32==306
regex==2024.5.15
requests==2.32.1
rsa==4.9
setuptools==69.5.1
sniffio==1.3.1
termcolor==2.4.0
tiktoken==0.7.0
tqdm==4.66.4
typing_extensions==4.11.0
uritemplate==4.1.1
urllib3==2.2.1
wheel==0.43.0
ekzhu commented 1 month ago

@BeibinLi

BeibinLi commented 1 month ago

Thanks! Good catch. I believe the logging mechanism did not account for Gemini at all, which caused this issue.

See the PR that fixes this issue: https://github.com/microsoft/autogen/pull/2749
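
The general shape of such a fix, sketched against the illustrative snippet earlier in this issue (a sketch only, not the contents of PR 2749): bind the client on the Gemini path and only log once a client has actually been created.

def register_default_client(config):
    # Sketch with illustrative names, reusing log_new_client from the earlier snippet
    api_type = config.get("api_type")
    client = None
    if api_type == "azure":
        client = "AzureOpenAI-client"
    elif api_type == "google":
        client = "Gemini-client"        # the Gemini path now binds 'client' too
    elif api_type is None:
        client = "OpenAI-client"
    if client is not None:              # log only when a client was created
        log_new_client(client, config)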