microsoft / autogen

A programming framework for agentic AI. Discord: https://aka.ms/autogen-dc. Roadmap: https://aka.ms/autogen-roadmap
https://microsoft.github.io/autogen/
Creative Commons Attribution 4.0 International

[Issue]: Function calling is not working properly for gemini model. #1198

Open Haripritamreddy opened 5 months ago

Haripritamreddy commented 5 months ago

Describe the issue

I used LiteLLM to get an OpenAI-compatible API for Google's Gemini Pro model and set it as the `base_url`. When I try function calling, the model returns a JSON object, but not in the expected format. I have tried different prompts but it still doesn't work.

Steps to reproduce

Step 1: run the following script.

```python
import yfinance as yf
import autogen

# Configuration for the OpenAI-compatible API (LiteLLM proxy in front of Gemini)
config_list = [
    {
        "model": "gpt-3.5-turbo",
        "api_key": "anything",
        "base_url": "https://6ad3-35-245-36-6.ngrok-free.app",
    }
]

# Autogen configuration
llm_config = {
    "functions": [
        {
            "name": "get_stock_price",
            "description": "Get the current stock price for a given stock symbol.",
            "parameters": {
                "type": "object",
                "properties": {
                    "symbol": {
                        "type": "string",
                        "description": "Stock symbol to get the price for.",
                    }
                },
                "required": ["symbol"],
            },
        },
    ],
    "config_list": config_list,
    "timeout": 120,
}

# Autogen agent initialization
chatbot = autogen.AssistantAgent(
    name="chatbot",
    system_message="For coding tasks, only use the functions you have been provided with. Reply TERMINATE when the task is done.",
    llm_config=llm_config,
)

# Function to get stock price
def get_stock_price(symbol):
    try:
        # Use yfinance to get the latest closing price
        stock_data = yf.Ticker(symbol)
        current_price = stock_data.history(period="1d")["Close"].iloc[-1]
        return f"The current price of {symbol} is ${current_price:.2f}"
    except Exception as e:
        return f"Error fetching stock price: {str(e)}"

# Autogen user proxy initialization
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    is_termination_msg=lambda x: x.get("content", "")
    and x.get("content", "").rstrip().endswith("TERMINATE"),
    human_input_mode="NEVER",
    max_consecutive_auto_reply=10,
    code_execution_config={"work_dir": "coding"},
)

# Register the get_stock_price function
user_proxy.register_function(
    function_map={
        "get_stock_price": get_stock_price,
    }
)

# Start the conversation
user_proxy.initiate_chat(
    chatbot,
    message="Get the current stock price for symbol apple.",
)
```
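For reference, AutoGen's function-calling flow expects the model's reply to carry an OpenAI-style `function_call` field rather than a free-form JSON blob in `content`. A minimal sketch of the message shape that would trigger the registered function (field names follow the OpenAI chat-completions format; the values here are illustrative):

```python
import json

# Illustrative assistant message in the OpenAI function-calling format.
# A model that instead returns JSON as plain text in "content" will not
# trigger the UserProxyAgent's function map.
expected_message = {
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "get_stock_price",
        # "arguments" is a JSON-encoded string, not a nested object
        "arguments": json.dumps({"symbol": "AAPL"}),
    },
}

# The executor reads the name and parses the JSON-encoded arguments string.
args = json.loads(expected_message["function_call"]["arguments"])
print(expected_message["function_call"]["name"], args)
```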

Screenshots and logs

*(screenshot omitted)*

Additional Information

No response

davorrunje commented 5 months ago

We are migrating code from function calls to tool calls. Do you have an option to use tool calls with Gemini?
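For readers hitting the same migration: the difference on the wire is whether the schema is sent under the legacy `functions` key or wrapped under the newer `tools` key. An illustrative sketch of the two request shapes, using the stock-price schema from the report above:

```python
# The same function schema, sent two ways (illustrative request fragments).
schema = {
    "name": "get_stock_price",
    "description": "Get the current stock price for a given stock symbol.",
    "parameters": {
        "type": "object",
        "properties": {"symbol": {"type": "string"}},
        "required": ["symbol"],
    },
}

# Legacy function-calling shape: a top-level "functions" list.
legacy_params = {"functions": [schema]}

# Tool-call shape: each schema wrapped as {"type": "function", ...}
# and sent under "tools" instead.
tool_params = {"tools": [{"type": "function", "function": schema}]}

print(list(legacy_params), list(tool_params))
```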

Haripritamreddy commented 5 months ago

No, it is not working. Unlike function calling, which is partially working (it at least gives JSON output), tool calls through LiteLLM throw errors.

```python
import autogen
from typing import Literal
from typing_extensions import Annotated

config_list = [
    {
        "model": "gpt-3.5-turbo",
        "api_key": "anything",
        "base_url": "http://abce-35-231-22-211.ngrok-free.app",
    }
]

llm_config = {
    "config_list": config_list,
    "timeout": 120,
}

currency_bot = autogen.AssistantAgent(
    name="currency_bot",
    system_message="For currency exchange tasks, only use the functions you have been provided with. Reply TERMINATE when the task is done.",
    llm_config=llm_config,
)

user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    is_termination_msg=lambda x: x.get("content", "")
    and x.get("content", "").rstrip().endswith("TERMINATE"),
    human_input_mode="NEVER",
    max_consecutive_auto_reply=10,
)

CurrencySymbol = Literal["USD", "EUR"]

def exchange_rate(base_currency: CurrencySymbol, quote_currency: CurrencySymbol) -> float:
    if base_currency == quote_currency:
        return 1.0
    elif base_currency == "USD" and quote_currency == "EUR":
        return 1 / 1.09
    elif base_currency == "EUR" and quote_currency == "USD":
        return 1.1
    else:
        raise ValueError(f"Unknown currencies {base_currency}, {quote_currency}")

@user_proxy.register_for_execution()
@currency_bot.register_for_llm(description="Currency exchange calculator.")
def currency_calculator(
    base_amount: Annotated[float, "Amount of currency in base_currency"],
    base_currency: Annotated[CurrencySymbol, "Base currency"] = "USD",
    quote_currency: Annotated[CurrencySymbol, "Quote currency"] = "EUR",
) -> str:
    quote_amount = exchange_rate(base_currency, quote_currency) * base_amount
    return f"{quote_amount} {quote_currency}"

# Start the conversation
user_proxy.initiate_chat(
    currency_bot,
    message="How much is 123.45 USD in EUR?",
)
```
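As a sanity check independent of the failing tool call, the arithmetic the executed function should produce for the example query can be verified standalone (this re-implements the hard-coded rates from the script above; no autogen needed):

```python
# Standalone re-implementation of the hard-coded rates from the script above,
# to verify what currency_calculator should return for the example query.
def exchange_rate(base_currency: str, quote_currency: str) -> float:
    if base_currency == quote_currency:
        return 1.0
    if base_currency == "USD" and quote_currency == "EUR":
        return 1 / 1.09
    if base_currency == "EUR" and quote_currency == "USD":
        return 1.1
    raise ValueError(f"Unknown currencies {base_currency}, {quote_currency}")

# "How much is 123.45 USD in EUR?" -> 123.45 / 1.09, roughly 113.26
quote_amount = exchange_rate("USD", "EUR") * 123.45
print(f"{quote_amount} EUR")
```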

Error:

```
How much is 123.45 USD in EUR?

Traceback (most recent call last):
  File "c:\Users\Asus\Documents\autogen\function_test.py", line 60, in <module>
    user_proxy.initiate_chat(
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\autogen\agentchat\conversable_agent.py", line 621, in initiate_chat
    self.send(self.generate_init_message(**context), recipient, silent=silent)
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\autogen\agentchat\conversable_agent.py", line 398, in send
    recipient.receive(message, self, request_reply, silent)
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\autogen\agentchat\conversable_agent.py", line 551, in receive
    reply = self.generate_reply(messages=self.chat_messages[sender], sender=sender)
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\autogen\agentchat\conversable_agent.py", line 1191, in generate_reply
    final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\autogen\agentchat\conversable_agent.py", line 708, in generate_oai_reply
    response = client.create(
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\autogen\oai\client.py", line 261, in create
    response = self._completions_create(client, params)
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\autogen\oai\client.py", line 359, in _completions_create
    response = completions.create(**params)
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\openai\_utils\_utils.py", line 272, in wrapper
    return func(*args, **kwargs)
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\openai\resources\chat\completions.py", line 645, in create
    return self._post(
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\openai\_base_client.py", line 1088, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\openai\_base_client.py", line 853, in request
    return self._request(
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\openai\_base_client.py", line 916, in _request
    return self._retry_request(
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\openai\_base_client.py", line 958, in _retry_request
    return self._request(
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\openai\_base_client.py", line 916, in _request
    return self._retry_request(
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\openai\_base_client.py", line 958, in _retry_request
    return self._request(
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\openai\_base_client.py", line 930, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Error code: 500 - {'detail': '\'functions\'\n\nTraceback (most recent call last):\n File "/usr/local/lib/python3.10/dist-packages/litellm/main.py", line 556, in completion\n optional_params = get_optional_params(\n File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 3198, in get_optional_params\n "tools", non_default_params.pop("functions")\nKeyError: \'functions\'\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File "/usr/local/lib/python3.10/dist-packages/litellm/main.py", line 215, in acompletion\n response = await loop.run_in_executor(None, func_with_context)\n File "/usr/lib/python3.10/concurrent/futures/thread.py", line 58, in run\n result = self.fn(*self.args, **self.kwargs)\n File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 2130, in wrapper\n raise e\n File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 2037, in wrapper\n result = original_function(*args, **kwargs)\n File "/usr/local/lib/python3.10/dist-packages/litellm/main.py", line 1746, in completion\n raise exception_type(\n File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 6628, in exception_type\n raise e\n File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 6603, in exception_type\n raise APIConnectionError(\nlitellm.exceptions.APIConnectionError: \'functions\'\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File "/usr/local/lib/python3.10/dist-packages/litellm/proxy/proxy_server.py", line 1464, in chat_completion\n response = await litellm.acompletion(**data)\n File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 2366, in wrapper_async\n raise e\n File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 2258, in wrapper_async\n result = await original_function(*args, **kwargs)\n File "/usr/local/lib/python3.10/dist-packages/litellm/main.py", line 227, in acompletion\n raise exception_type(\n File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 6628, in exception_type\n raise e\n File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 6596, in exception_type\n raise APIConnectionError(\nlitellm.exceptions.APIConnectionError: \'functions\'\n'}
```
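The root cause is visible in the inner traceback: litellm's `get_optional_params` pops `"functions"` unconditionally while translating legacy params into tool calls, so a request that never sent a `functions` key raises `KeyError`. A rough illustration of the failure mode and a guarded alternative (this is a sketch, not litellm's actual code):

```python
def convert_functions_to_tools(params: dict) -> dict:
    """Sketch of a functions -> tools translation that crashes like the
    traceback above: pop() raises KeyError when "functions" is absent."""
    functions = params.pop("functions")  # KeyError if the key is missing
    params["tools"] = [{"type": "function", "function": f} for f in functions]
    return params

def convert_safely(params: dict) -> dict:
    """Guarded version: only translate when the legacy key is present."""
    if "functions" in params:
        params["tools"] = [
            {"type": "function", "function": f} for f in params.pop("functions")
        ]
    return params

# A tools-only request passes through the guarded version untouched.
print(convert_safely({"tools": [{"type": "function", "function": {"name": "f"}}]}))
```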

davorrunje commented 5 months ago

It looks like we broke it with all of the latest changes :/ I opened an issue for it and will fix it soon: https://github.com/microsoft/autogen/issues/1206

nileshtrivedi commented 1 month ago

Is function calling working now? It seems to fail with gemini-1.5-flash-latest for me. Here are Gemini's docs: https://ai.google.dev/gemini-api/docs/function-calling

davorrunje commented 3 weeks ago

@nileshtrivedi function calling is working, but it might be broken in your use case. Could you please create a detailed issue with a code example and mention me in it? I'll take it over from there.