microsoft / autogen

A programming framework for agentic AI 🤖
https://microsoft.github.io/autogen/

[Issue]: Non open AI tools #3472

Open oyentis opened 2 months ago

oyentis commented 2 months ago

Describe the issue

In my project, I am using the autogen framework. By the design of the project, I must use tools. Unfortunately, the tools are not being called when I use a Gemini model.

{
    "model": "gemini-1.5-pro-001",
    "api_key": "MY_API_KEY",
    "api_type": "google"
}

Everything works fine when I use gpt-3.5-turbo-0125. However, when I switch to Gemini, the tools don't get called.

I installed autogen and autogen[gemini] as recommended. Autogen version: 0.2.32

Please find code snippet:

test1.py

from autogen import AssistantAgent, UserProxyAgent

llm_config = {
    "config_list": [
        {
        "model": "gemini-1.5-pro-001",
        "api_key": "MY-API-KEY",
        "api_type": "google",
        }
    ],
}

def get_current_weather(location: str) -> str:
    """Get the current weather in a given location (returns a JSON string)."""

    import json
    print("HELLO ", location)

    if "tokyo" in location.lower():
        return json.dumps({"location": "Tokyo", "temperature": "10", "unit": "celsius"})
    elif "san francisco" in location.lower():
        return json.dumps({"location": "San Francisco", "temperature": "72", "unit": "fahrenheit"})
    elif "paris" in location.lower():
        return json.dumps({"location": "Paris", "temperature": "22", "unit": "celsius"})
    else:
        return json.dumps({"location": location, "temperature": "unknown"})

assistant = AssistantAgent(
    "assistant",
    description="You are a helpful AI assistant. You can detect locations.",
    llm_config=llm_config,
)
user_proxy = UserProxyAgent(
    name="User",
    description="You are able to get the weather for a location.",
    llm_config=False,
    is_termination_msg=lambda msg: msg.get("content") is not None and "TERMINATE" in msg["content"],
    human_input_mode="NEVER",
)
assistant.register_for_llm(name="get_current_weather", description="Get the current weather for a location.")(get_current_weather)
user_proxy.register_for_execution(name="get_current_weather")(get_current_weather)

response = user_proxy.initiate_chat(
    assistant,
    message="What's the weather in San Francisco?",
    max_turns=10,
)
print(response)
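For context, `register_for_llm` attaches an OpenAI-style tool schema to the assistant's `llm_config`, and a Gemini-aware client then has to translate that schema into Gemini's function-declaration format for the call to happen at all. A minimal sketch of what such a schema looks like for `get_current_weather` (field names follow the OpenAI function-calling format; the exact schema autogen derives may differ):

```python
import json

# Sketch of an OpenAI-style tool schema for get_current_weather.
# This is the kind of structure register_for_llm is expected to place in
# llm_config["tools"]; a Gemini connector must translate it into Gemini's
# function-declaration format before tool calls can work.
tool_schema = {
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string"},
            },
            "required": ["location"],
        },
    },
}

print(json.dumps(tool_schema, indent=2))
```

If the Gemini connector silently drops this schema instead of translating it, the model never sees the tool, which would match the behavior described above.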

Running this snippet, I get a conversation in which the tools are never called. In contrast, the tools work when I call genai directly (version 0.7.2):

test2.py

import google.generativeai as genai
import os

genai.configure(api_key="MY_API_KEY")

def get_current_weather(location: str) -> str:
    """Get the current weather in a given location (returns a JSON string)."""

    import json
    print("HELLO ", location)

    # return "Cool weahther"
    if "tokyo" in location.lower():
        return json.dumps({"location": "Tokyo", "temperature": "10", "unit": "celsius"})
    elif "san francisco" in location.lower():
        return json.dumps({"location": "San Francisco", "temperature": "72", "unit": "fahrenheit"})
    elif "paris" in location.lower():
        return json.dumps({"location": "Paris", "temperature": "22", "unit": "celsius"})
    else:
        return json.dumps({"location": location, "temperature": "unknown"})

model = genai.GenerativeModel(model_name='gemini-1.5-flash-latest',
                              tools=[get_current_weather])

chat = model.start_chat(enable_automatic_function_calling=True)

response = chat.send_message("What's the weather in San Francisco?")

print(response.text)

result

HELLO  San Francisco
The current weather in San Francisco is 72 degrees Fahrenheit.

To summarize:

- Does autogen support tools with Gemini, or is this support planned for a future release?
- Is there any way to use tools in autogen with Gemini now?
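Until tool calls work natively through the Gemini connector, one stopgap (purely illustrative, not an autogen API) is prompt-based dispatch: instruct the model via the system message to reply with a JSON object naming the tool and its arguments, then parse and dispatch that reply yourself. A self-contained sketch of the dispatch side, with a simulated model reply standing in for the actual Gemini turn:

```python
import json

def get_current_weather(location: str) -> str:
    """Toy version of the tool from the snippets above."""
    return json.dumps({"location": location, "temperature": "72", "unit": "fahrenheit"})

# Hypothetical registry mapping tool names to callables.
TOOLS = {"get_current_weather": get_current_weather}

def dispatch(reply: str) -> str:
    """Parse a model reply of the form {"tool": ..., "args": {...}} and invoke the tool."""
    call = json.loads(reply)
    fn = TOOLS[call["tool"]]
    return fn(**call["args"])

# Simulated model reply; in practice this would come from the Gemini chat turn.
reply = '{"tool": "get_current_weather", "args": {"location": "San Francisco"}}'
print(dispatch(reply))
```

This loses autogen's built-in tool plumbing (schema validation, automatic execution turns), so it is only a workaround until native support lands.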

Steps to reproduce

test1

  1. Install autogen
    pip install pyautogen
    pip install "pyautogen[gemini,retrievechat,lmm]"
  2. Run snippets
    python3 test1.py

test2

  1. Install the Gemini API SDK
    pip install -q -U google-generativeai
  2. Run snippet
    python3 test2.py

Screenshots and logs

Autogen version: 0.2.32; google-generativeai (genai) version: 0.7.2

Additional Information

Used resources:

- https://ai.google.dev/gemini-api/docs/function-calling/tutorial?lang=python
- https://microsoft.github.io/autogen/docs/topics/non-openai-models/cloud-gemini_vertexai

KhaoticMind commented 2 months ago

I've also observed this issue. With OpenAI models the tools get called, but not with Gemini. Are there any plans to add support for tool calling on Gemini with autogen?