BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: `parallel_tool_calls` with OpenAI #4639

Closed: Clad3815 closed this issue 3 months ago

Clad3815 commented 3 months ago

What happened?

Reopen of #4619. See my curl request:

(base) ➜  clad3815 git:(master) ✗ curl --location 'http://0.0.0.0:4000/chat/completions' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer sk-1234' \
--data '{
  "model": "gpt-3.5-turbo",
  "messages": [
    {
      "role": "user",
      "content": "What'\''s the weather like in Boston today?"
    }
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
          "type": "object",
          "properties": {
            "location": {
              "type": "string",
              "description": "The city and state, e.g. San Francisco, CA"
            },
            "unit": {
              "type": "string",
              "enum": ["celsius", "fahrenheit"]
            }
          },
          "required": ["location"]
        }
      }
    }
  ],
  "tool_choice": "auto",
  "parallel_tool_calls": true
  }'
{"error":{"message":"litellm.APIConnectionError: APIConnectionError: OpenAIException - AsyncCompletions.create() got an unexpected keyword argument 'parallel_tool_calls'","type":null,"param":null,"code":500}}%

Using Docker with the following config.yaml:

model_list:
# OpenAI Models
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: openai/gpt-3.5-turbo
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
  - model_name: gpt-4-turbo
    litellm_params:
      model: openai/gpt-4-turbo
  - model_name: tts-1
    litellm_params:
      model: openai/tts-1
  - model_name: tts-1-hd
    litellm_params:
      model: openai/tts-1-hd
  - model_name: dall-e-2
    litellm_params:
      model: openai/dall-e-2
    model_info:
      mode: image_generation
  - model_name: dall-e-3
    litellm_params:
      model: openai/dall-e-3
    model_info:
      mode: image_generation
  - model_name: text-moderation-stable
    litellm_params:
      model: openai/text-moderation-stable
  - model_name: text-moderation-latest
    litellm_params:
      model: openai/text-moderation-latest
  - model_name: whisper-1
    litellm_params:
      model: openai/whisper-1
    model_info:
      mode: audio_transcription

Using Docker image FROM ghcr.io/berriai/litellm:main-v1.41.14


krrishdholakia commented 3 months ago

Able to repro this on Docker, not on local.
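
A minimal local repro sketch, going through litellm.acompletion directly (the same call path the proxy hits, see the traceback below); assumes OPENAI_API_KEY is set in the environment and trims the tool definition for brevity:

# Local repro sketch: forward parallel_tool_calls through litellm.acompletion,
# the same code path the proxy uses.
import asyncio

import litellm

async def main():
    response = await litellm.acompletion(
        model="openai/gpt-3.5-turbo",
        messages=[{"role": "user", "content": "What's the weather like in Boston today?"}],
        tools=[
            {
                "type": "function",
                "function": {
                    "name": "get_current_weather",
                    "description": "Get the current weather in a given location",
                    "parameters": {
                        "type": "object",
                        "properties": {"location": {"type": "string"}},
                        "required": ["location"],
                    },
                },
            }
        ],
        tool_choice="auto",
        parallel_tool_calls=True,
    )
    print(response)

asyncio.run(main())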

krrishdholakia commented 3 months ago

Pasting the server log I'm seeing:

18:41:32 - LiteLLM:DEBUG: utils.py:242 -

POST Request Sent from LiteLLM:
curl -X POST \
https://api.openai.com/v1/ \
-H 'Authorization: Bearer sk-snlm********************************************' \
-d '{'model': 'gpt-3.5-turbo', 'messages': [{'role': 'user', 'content': "What's the weather like in Boston today?"}], 'stream': False, 'tools': [{'type': 'function', 'function': {'name': 'get_current_weather', 'description': 'Get the current weather in a given location', 'parameters': {'type': 'object', 'properties': {'location': {'type': 'string', 'description': 'The city and state, e.g. San Francisco, CA'}, 'unit': {'type': 'string', 'enum': ['celsius', 'fahrenheit']}}, 'required': ['location']}}}], 'tool_choice': 'auto', 'parallel_tool_calls': True, 'extra_body': {}}'

18:41:32 - LiteLLM:ERROR: main.py:409 - litellm.acompletion(): Exception occured - AsyncCompletions.create() got an unexpected keyword argument 'parallel_tool_calls'
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/litellm/main.py", line 388, in acompletion
    response = await init_response
               ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/llms/openai.py", line 1019, in acompletion
    raise e
  File "/usr/local/lib/python3.11/site-packages/litellm/llms/openai.py", line 1002, in acompletion
    headers, response = await self.make_openai_chat_completion_request(
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/llms/openai.py", line 783, in make_openai_chat_completion_request
    raise e
  File "/usr/local/lib/python3.11/site-packages/litellm/llms/openai.py", line 778, in make_openai_chat_completion_request
    response = await openai_aclient.chat.completions.create(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/openai/_utils/_utils.py", line 277, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
TypeError: AsyncCompletions.create() got an unexpected keyword argument 'parallel_tool_calls'

18:41:32 - LiteLLM:DEBUG: main.py:414 - Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/litellm/main.py", line 388, in acompletion
    response = await init_response
               ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/llms/openai.py", line 1019, in acompletion
    raise e
  File "/usr/local/lib/python3.11/site-packages/litellm/llms/openai.py", line 1002, in acompletion
    headers, response = await self.make_openai_chat_completion_request(
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/llms/openai.py", line 783, in make_openai_chat_completion_request
    raise e
  File "/usr/local/lib/python3.11/site-packages/litellm/llms/openai.py", line 778, in make_openai_chat_completion_request
    response = await openai_aclient.chat.completions.create(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/openai/_utils/_utils.py", line 277, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
TypeError: AsyncCompletions.create() got an unexpected keyword argument 'parallel_tool_calls'

krrishdholakia commented 3 months ago

This might be related to the openai package version.

krrishdholakia commented 3 months ago

Confirmed - this is fixed by just bumping the openai package version.
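
A quick way to check whether the installed openai package already accepts the argument (a diagnostic sketch, not part of litellm):

# Diagnostic sketch: does the installed openai client expose parallel_tool_calls?
import inspect

import openai
from openai.resources.chat.completions import AsyncCompletions

print(openai.__version__)
# False on older openai releases, True once the package is bumped to one that supports parallel_tool_calls.
print("parallel_tool_calls" in inspect.signature(AsyncCompletions.create).parameters)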

krrishdholakia commented 3 months ago

Fixed with https://github.com/BerriAI/litellm/commit/d5d782f84454c8c61b5ef34e31d671b1f780c9bc

[Screenshot: 2024-07-10 at 11:53:06 AM]

Live in the next release - v1.41.15+ @Clad3815


How're you using the proxy today? @Clad3815