Closed: Clad3815 closed this issue 3 months ago.
Able to reproduce this on Docker, but not locally.
Pasting the server log I'm seeing:
18:41:32 - LiteLLM:DEBUG: utils.py:242 -
POST Request Sent from LiteLLM:
curl -X POST \
https://api.openai.com/v1/ \
-H 'Authorization: Bearer sk-snlm********************************************' \
-d '{'model': 'gpt-3.5-turbo', 'messages': [{'role': 'user', 'content': "What's the weather like in Boston today?"}], 'stream': False, 'tools': [{'type': 'function', 'function': {'name': 'get_current_weather', 'description': 'Get the current weather in a given location', 'parameters': {'type': 'object', 'properties': {'location': {'type': 'string', 'description': 'The city and state, e.g. San Francisco, CA'}, 'unit': {'type': 'string', 'enum': ['celsius', 'fahrenheit']}}, 'required': ['location']}}}], 'tool_choice': 'auto', 'parallel_tool_calls': True, 'extra_body': {}}'
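A side note on replaying this request: the `-d` body in the debug log above is a Python dict repr (single quotes, `False`/`True`), not valid JSON, so pasting it back into curl as-is will fail. One way to convert it, sketched with an abbreviated copy of the logged body (not the full payload):

```python
import ast
import json

# Abbreviated body string copied from the debug log above (Python repr, not JSON).
logged = "{'model': 'gpt-3.5-turbo', 'stream': False, 'parallel_tool_calls': True}"

body = ast.literal_eval(logged)  # safely parse the Python literal into a dict
print(json.dumps(body))          # emits valid JSON, suitable for a real curl -d
```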
18:41:32 - LiteLLM:ERROR: main.py:409 - litellm.acompletion(): Exception occured - AsyncCompletions.create() got an unexpected keyword argument 'parallel_tool_calls'
Traceback (most recent call last):
File "/usr/local/lib/python3.11/site-packages/litellm/main.py", line 388, in acompletion
response = await init_response
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/litellm/llms/openai.py", line 1019, in acompletion
raise e
File "/usr/local/lib/python3.11/site-packages/litellm/llms/openai.py", line 1002, in acompletion
headers, response = await self.make_openai_chat_completion_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/litellm/llms/openai.py", line 783, in make_openai_chat_completion_request
raise e
File "/usr/local/lib/python3.11/site-packages/litellm/llms/openai.py", line 778, in make_openai_chat_completion_request
response = await openai_aclient.chat.completions.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/_utils/_utils.py", line 277, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
TypeError: AsyncCompletions.create() got an unexpected keyword argument 'parallel_tool_calls'
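The TypeError itself is ordinary Python behavior: the installed openai client's `create()` signature simply does not declare `parallel_tool_calls`, so the keyword is rejected before any HTTP request is sent. A minimal stand-in (not the real SDK method, which has many more parameters) shows the same failure:

```python
# 'create' is a stand-in for the old SDK method; older openai releases
# declare no parameter named parallel_tool_calls, hence the TypeError.
def create(*, model, messages, stream=False):
    return {"model": model, "messages": messages, "stream": stream}

try:
    create(model="gpt-3.5-turbo", messages=[], parallel_tool_calls=True)
except TypeError as exc:
    print(exc)  # create() got an unexpected keyword argument 'parallel_tool_calls'
```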
This might be related to the openai package version.
Confirmed: this is fixed by just bumping the openai version.
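For anyone who can't bump the dependency immediately, a caller-side guard is one possible workaround. This is only a sketch, and the v1.31 cutoff is an assumption from memory of the SDK changelog, not something confirmed in this thread; verify the exact release before relying on it:

```python
# ASSUMPTION: the OpenAI Python SDK began accepting parallel_tool_calls
# around v1.31; check the SDK changelog for the exact cutoff before use.
MIN_SUPPORTED = (1, 31)

def supports_parallel_tool_calls(sdk_version: str) -> bool:
    """Return True if the given openai SDK version string accepts the kwarg."""
    major, minor = (int(part) for part in sdk_version.split(".")[:2])
    return (major, minor) >= MIN_SUPPORTED

# Drop the parameter when the installed client is too old.
kwargs = {"model": "gpt-3.5-turbo", "parallel_tool_calls": True}
if not supports_parallel_tool_calls("1.30.0"):
    kwargs.pop("parallel_tool_calls", None)
```

In practice you would compare against `importlib.metadata.version("openai")` instead of a hard-coded string; upgrading the openai package, as confirmed above, remains the cleaner fix.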
Fixed with https://github.com/BerriAI/litellm/commit/d5d782f84454c8c61b5ef34e31d671b1f780c9bc
Live in the next release, v1.41.15+.
@Clad3815 How're you using the proxy today?
What happened?
Reopen of #4619. Look at my curl request.
Using Docker and the config.yaml.
Using version
FROM ghcr.io/berriai/litellm:main-v1.41.14
Relevant log output
No response