
[Issue]: LiteLLM with Ollama function calling cannot find function #2548

Open patrickwasp opened 5 months ago

patrickwasp commented 5 months ago

Describe the issue

When I try to run the example listed at https://microsoft.github.io/autogen/docs/topics/non-openai-models/local-litellm-ollama#example-with-function-calling, the tool call fails because the function cannot be found. The previous, simpler example on the same page (without function calling) works as expected.

This is my environment (docker-compose.yaml):

services:

  ollama:
    image: ollama/ollama:0.1.32
    tty: true
    restart: unless-stopped
    ports:
      - 11434:11434
    volumes:
      - ./data/ollama:/root/.ollama
      - ./config/ollama/modelfiles:/modelfiles
    environment:
      - ENV_VARIABLE=value
      - OLLAMA_DEBUG=true
      - LLAMA_TRACE=1
      - GGML_DEBUG=1
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [ gpu ]

  litellm:
    image: ghcr.io/berriai/litellm:main-v1.35.31
    restart: unless-stopped
    environment:
      - LITELLM_LOG=DEBUG
    ports:
      - 4000:4000
    volumes:
      - ./config/litellm/litellm_config.yaml:/app/config.yaml
    command: --config /app/config.yaml
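
The referenced litellm_config.yaml is not included in the report. For reference, a minimal sketch of what it presumably contains, assuming the standard LiteLLM proxy model_list format and the model name used below (the actual file may differ):

model_list:
  - model_name: ollama/dolphincoder:8k      # name clients send to the proxy
    litellm_params:
      model: ollama/dolphincoder:8k         # provider/model LiteLLM routes to
      api_base: http://ollama:11434         # the Ollama service from the compose file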

Steps to reproduce

  1. Copy the example into a local file (a condensed version is sketched after this list).
  2. Modify the config to point at the local LLM system:
local_llm_config = {
    "config_list": [
        {
            "model": "ollama/dolphincoder:8k",
            "api_key": "NotRequired",
            "base_url": "http://10.4.4.207:4000",
        }
    ],
    "cache_seed": None,  # Turns off caching, useful for testing different models
}
  3. Run the local file.
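
For reference, here is a condensed sketch of the example script being run, based on the linked docs page; treat it as an approximation of the original, not an exact copy:

import autogen
from typing import Literal
from typing_extensions import Annotated

local_llm_config = {
    "config_list": [
        {
            "model": "ollama/dolphincoder:8k",
            "api_key": "NotRequired",
            "base_url": "http://10.4.4.207:4000",
        }
    ],
    "cache_seed": None,
}

# LLM-backed agent that suggests tool calls.
chatbot = autogen.AssistantAgent(
    name="chatbot",
    system_message="For currency exchange tasks, only use the functions you have been provided with. Reply TERMINATE when the task is done.",
    llm_config=local_llm_config,
)

# Proxy agent that executes the suggested tool calls.
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    is_termination_msg=lambda x: x.get("content", "") and x.get("content", "").rstrip().endswith("TERMINATE"),
    human_input_mode="NEVER",
    max_consecutive_auto_reply=1,
)

CurrencySymbol = Literal["USD", "EUR"]

# Register the tool with the LLM agent (schema) and the user proxy (execution).
@user_proxy.register_for_execution()
@chatbot.register_for_llm(description="Currency exchange calculator.")
def currency_calculator(
    base_amount: Annotated[float, "Amount of currency in base_currency"],
    base_currency: Annotated[CurrencySymbol, "Base currency"] = "USD",
    quote_currency: Annotated[CurrencySymbol, "Quote currency"] = "EUR",
) -> str:
    # Hard-coded EUR/USD rate, as in the docs example.
    rate = 1.1 if (base_currency, quote_currency) == ("EUR", "USD") else 1 / 1.1
    return f"{rate * base_amount} {quote_currency}"

res = user_proxy.initiate_chat(chatbot, message="How much is 123.45 EUR in USD?")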

Screenshots and logs

Output:

python3.12 autogen_function_calling_example.py 
user_proxy (to chatbot):

How much is 123.45 EUR in USD?

--------------------------------------------------------------------------------
chatbot (to user_proxy):

***** Suggested tool call (call_68bb3d2d-100b-4d43-9dee-cd968f8eaa13):  *****
Arguments: 
{
  "name": "currency_calculator",
  "arguments": {
    "base_amount": 123.45,
    "quote_currency": "USD"
  }
}

*****************************************************************************

--------------------------------------------------------------------------------
user_proxy (to chatbot):

user_proxy (to chatbot):

***** Response from calling tool (call_68bb3d2d-100b-4d43-9dee-cd968f8eaa13) *****
Error: Function  not found.
**********************************************************************************

--------------------------------------------------------------------------------
Traceback (most recent call last):
  File "/app/backend/examples/autogen_function_calling_example.py", line 79, in <module>
    res = user_proxy.initiate_chat(
          ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 991, in initiate_chat
    self.send(msg2send, recipient, silent=silent)
  File "/usr/local/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 632, in send
    recipient.receive(message, self, request_reply, silent)
  File "/usr/local/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 794, in receive
    self.send(reply, sender, silent=silent)
  File "/usr/local/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 632, in send
    recipient.receive(message, self, request_reply, silent)
  File "/usr/local/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 794, in receive
    self.send(reply, sender, silent=silent)
  File "/usr/local/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 632, in send
    recipient.receive(message, self, request_reply, silent)
  File "/usr/local/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 792, in receive
    reply = self.generate_reply(messages=self.chat_messages[sender], sender=sender)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 1934, in generate_reply
    final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 1300, in generate_oai_reply
    extracted_response = self._generate_oai_reply_from_client(
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 1319, in _generate_oai_reply_from_client
    response = llm_client.create(
               ^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/autogen/oai/client.py", line 638, in create
    response = client.create(params)
               ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/autogen/oai/client.py", line 285, in create
    response = completions.create(**params)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/_utils/_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/resources/chat/completions.py", line 581, in create
    return self._post(
           ^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 1233, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 922, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 998, in _request
    return self._retry_request(
           ^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 1046, in _retry_request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 998, in _request
    return self._retry_request(
           ^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 1046, in _retry_request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 1013, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Error code: 500 - {'error': {'message': 'list index out of range', 'type': 'None', 'param': 'None', 'code': 500}}

Additional Information

autogenstudio==0.0.56, Python 3.12, Ubuntu 22.04

ekzhu commented 5 months ago

The language model made an error when formulating the arguments for the function currency_calculator:

Arguments: 
{
  "name": "currency_calculator",
  "arguments": {
    "base_amount": 123.45,
    "quote_currency": "USD"
  }
}

The above is clearly incorrect: the model nested the entire tool call (name plus arguments) inside the arguments field, leaving the tool call's own function name empty, which is why the runtime reports "Function  not found". The correct arguments JSON should be just the inner object under "arguments" above.

You can take a look at the expected output of tool call messages: https://microsoft.github.io/autogen/docs/tutorial/tool-use#using-tool
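
For comparison, a correctly formed OpenAI-style tool call for this request would look roughly like this (the id is reused from the log above for illustration; note the function name sits at the top level and arguments holds only the serialized parameter object):

{
  "tool_calls": [
    {
      "id": "call_68bb3d2d-100b-4d43-9dee-cd968f8eaa13",
      "type": "function",
      "function": {
        "name": "currency_calculator",
        "arguments": "{\"base_amount\": 123.45, \"quote_currency\": \"USD\"}"
      }
    }
  ]
}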

cc @marklysze for awareness.

marklysze commented 5 months ago

Hey @patrickwasp, for function calling with LiteLLM you'll need to use "ollama_chat/" instead of "ollama/" in the model name (or on the LiteLLM command line).

So ollama/dolphincoder:8k should be ollama_chat/dolphincoder:8k.
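
Applied to the config from the report, only the model prefix changes (if the LiteLLM proxy's litellm_config.yaml maps model names explicitly, its model_name entry would presumably need the same prefix):

local_llm_config = {
    "config_list": [
        {
            # ollama_chat/ routes through LiteLLM's chat path, which supports function calling
            "model": "ollama_chat/dolphincoder:8k",
            "api_key": "NotRequired",
            "base_url": "http://10.4.4.207:4000",
        }
    ],
    "cache_seed": None,  # Turns off caching, useful for testing different models
}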

marklysze commented 1 month ago

Hey @patrickwasp, did this help solve your function calling issue?