openai / openai-python

The official Python library for the OpenAI API
https://pypi.org/project/openai/
Apache License 2.0

The official example for Function Calling doesn't work with SDK version 1.1.1 #703

Closed mikulskibartosz closed 11 months ago

mikulskibartosz commented 11 months ago

Expected behavior

The "Example with one function called in parallel" code from the documentation should correctly show how to use the function call feature.

Actual behavior

I get BadRequestError: Error code: 400 - {'error': {'message': "'content' is a required property - 'messages.1'", 'type': 'invalid_request_error', 'param': None, 'code': None}} while running the code.

Stack trace:

BadRequestError                           Traceback (most recent call last)
[<ipython-input-73-3a60881757d5>](https://localhost:8080/#) in <cell line: 77>()
     75         )  # get a new response from the model where it can see the function response
     76         return second_response
---> 77 print(run_conversation())

5 frames
[<ipython-input-73-3a60881757d5>](https://localhost:8080/#) in run_conversation()
     70                 }
     71             )  # extend conversation with function response
---> 72         second_response = openai.chat.completions.create(
     73             model="gpt-3.5-turbo-1106",
     74             messages=messages,

[/usr/local/lib/python3.10/dist-packages/openai/_utils/_utils.py](https://localhost:8080/#) in wrapper(*args, **kwargs)
    297                         msg = f"Missing required argument: {quote(missing[0])}"
    298                 raise TypeError(msg)
--> 299             return func(*args, **kwargs)
    300 
    301         return wrapper  # type: ignore

[/usr/local/lib/python3.10/dist-packages/openai/resources/chat/completions.py](https://localhost:8080/#) in create(self, messages, model, frequency_penalty, function_call, functions, logit_bias, max_tokens, n, presence_penalty, response_format, seed, stop, stream, temperature, tool_choice, tools, top_p, user, extra_headers, extra_query, extra_body, timeout)
    554         timeout: float | httpx.Timeout | None | NotGiven = NOT_GIVEN,
    555     ) -> ChatCompletion | Stream[ChatCompletionChunk]:
--> 556         return self._post(
    557             "/chat/completions",
    558             body=maybe_transform(

[/usr/local/lib/python3.10/dist-packages/openai/_base_client.py](https://localhost:8080/#) in post(self, path, cast_to, body, options, files, stream, stream_cls)
   1053             method="post", url=path, json_data=body, files=to_httpx_files(files), **options
   1054         )
-> 1055         return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
   1056 
   1057     def patch(

[/usr/local/lib/python3.10/dist-packages/openai/_base_client.py](https://localhost:8080/#) in request(self, cast_to, options, remaining_retries, stream, stream_cls)
    832         stream_cls: type[_StreamT] | None = None,
    833     ) -> ResponseT | _StreamT:
--> 834         return self._request(
    835             cast_to=cast_to,
    836             options=options,

[/usr/local/lib/python3.10/dist-packages/openai/_base_client.py](https://localhost:8080/#) in _request(self, cast_to, options, remaining_retries, stream, stream_cls)
    875             # to completion before attempting to access the response text.
    876             err.response.read()
--> 877             raise self._make_status_error_from_response(err.response) from None
    878         except httpx.TimeoutException as err:
    879             if retries > 0:

Versions

OpenAI SDK: 1.1.1
Python: 3.10.12

RobertCraigie commented 11 months ago

Have you verified that you've received access to the new API yet? I ran this example earlier and it worked as expected for me.

unconv commented 11 months ago

I'm experiencing the same issue with OpenAI 1.1.1 and Python 3.10.12.

A workaround is to change the None values in the response_message to acceptable ones:

if response_message.content is None:
    response_message.content = ""
if response_message.function_call is None:
    del response_message.function_call

Add this after the line response_message = response.choices[0].message (line 44)

EDIT: The proper fix is to upgrade pydantic with pip install --upgrade pydantic

mikulskibartosz commented 11 months ago

@RobertCraigie, I think it's unrelated to having access to the new API or the model because it fails during the second call, line: second_response = openai.chat.completions.create(. The first one worked fine, and I received the function name with its arguments.

However, if it's an API access issue, the error message is misleading.

mikulskibartosz commented 11 months ago

@unconv del response_message.function_call removes the function with its arguments from the conversation, so it won't be available to the AI model while it generates subsequent messages. It may affect the results because the model can access the function's response (that you provide) but doesn't know the query anymore (it's removed from the chat).

unconv commented 11 months ago

@mikulskibartosz It won't affect the results since it is deleted only if it is None (see the if statement it's wrapped in). The problem is the library is sending content and function_call as None but the API doesn't allow that. Setting content to an empty string in case it is None and removing function_call altogether when it is None solves the issue.

EDIT: No need to remove function_call - I only needed it at first because I used the following:

response_message = dict(response.choices[0].message)
if response_message["content"] is None:
    response_message["content"] = ""

Without casting to a dict it is sufficient to set the content to an empty string if it is None
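A minimal, self-contained sketch of that fix (no API call is made; the Message class below is a hypothetical stand-in for the SDK's ChatCompletionMessage, with attribute names taken from this thread):

```python
# Stand-in for the SDK's message object; attribute names follow the thread.
class Message:
    def __init__(self, role, content=None, function_call=None, tool_calls=None):
        self.role = role
        self.content = content
        self.function_call = function_call
        self.tool_calls = tool_calls

# Simulate the first response: tool calls present, content is None.
response_message = Message(role="assistant", tool_calls=[{"id": "call_123"}])

# The fix: the API rejects a literal None content, so coerce it to "".
if response_message.content is None:
    response_message.content = ""

print(response_message.content == "")  # True
```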

bgonzalezfractal commented 11 months ago

@unconv The patch works, although I did need to remove the function call since it was None as well; replacing it with "" did not work. To make the example run:

response_message = response.choices[0].message
tool_calls = response_message.tool_calls
response_message = dict(response.choices[0].message)
if response_message["content"] is None:
    response_message["content"] = ""
if response_message["function_call"] is None:
    del response_message["function_call"]

cjpark-data commented 11 months ago

@bgonzalezfractal Thank you. I solved this problem with your solution.

miwiley commented 11 months ago

@unconv - thank you! I was getting stumped on the options.

Code commented below; func = search_api:


# Search API Function - duckpy
import json
import openai
from duckpy import Client

duckduckgo_client = Client()  # duckpy search client

def search_api(input):
    #print ("duck-duck-go is searching for:", input)
    content = duckduckgo_client.search(input)
    #print ('duck-duck-go', content)
    return str(content)

# Functions Dict
func_dict = { 
    "search_api": search_api,  # Search Function
}

# OpenAI - Check/Run Functions (tools)
# User Input from Front-End
def run_conversation(user_input): # OpenAI - Tools/Functions
    #print ("incoming_msg", user_input)
    messages = [{"role": "user", "content": user_input}]
    tools = [
        {
            "type": "function",
            "function": {
                "name": "get_online_data",
                "description": "Get real-time data from the web",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "input": {
                            "type": "string",
                            "description": "The search query, e.g. 'What is the latest stock price of Apple?'",
                        },
                    },
                    "required": ["input"],
                },
            },
        }
    ]

    response = openai.chat.completions.create(
        model="gpt-3.5-turbo-1106",
        messages=messages,
        tools=tools,
        tool_choice="auto",
    )
    response_message = response.choices[0].message
    #print ("1st response_message", response_message)
    total_tokens = response.usage.total_tokens

    # If there are no tool calls, concatenate the content and total tokens, AI response is complete
    if response.choices[0].message.tool_calls is None:
        # Format the output as desired, here it's concatenated as a string
        return response_message.content + " Total tokens used: " + str(total_tokens)

    tool_calls = response_message.tool_calls
    response_message = dict(response.choices[0].message) # convert to dict to make it mutable
    if response_message["function_call"] is None: # if there are no function calls, remove the key
        del response_message["function_call"] # remove the key
        # print ("response_message after key removal", response_message)
        # response_message after key removal {'content': None, 'role': 'assistant', 'tool_calls': [ChatCompletionMessageToolCall(id='call_HWJSKeYTBb8TEbYe96RhZOgU', function=Function(arguments='{"input":"latest news on maui"}', name='get_online_data'), type='function')]}

    # Step 2: check if the model wanted to call a function
    if tool_calls: # if there are function calls

        # Step 3: call the function
        available_functions = {
            "get_online_data": search_api,
        }  # only one function in this example, but you can have multiple
        messages.append(response_message)  # extend conversation with assistant's reply

        # Step 4: send the info for each function call and function response to the model
        for tool_call in tool_calls:
            function_name = tool_call.function.name # 'tool_calls' .function.name='get_online_data'
            function_to_call = available_functions[function_name] # 'available_functions' = "get_online_data" [could be more than one function]
            function_args = json.loads(tool_call.function.arguments) # 'tool_call' .function.arguments='{"input":"latest news on israel"}'
            function_response = function_to_call(**function_args) # now call the function(s) with the arguments
            messages.append(
                {
                    "tool_call_id": tool_call.id,
                    "role": "tool",
                    "name": function_name,
                    "content": function_response,
                }
            )
        second_response = openai.chat.completions.create(
            model="gpt-3.5-turbo-1106",
            messages=messages,
        )
        print ("second_response", second_response)
        # second_response ChatCompletion(id='chatcmpl-8IzEv0nQ0CBj8hkYZEd60GgiAT3m3', choices=[Choice(finish_reason='stop', index=0, message=ChatCompletionMessage(content='The latest news in Dallas includes breaking stories and investigative journalism on various topics such as crime, education, environment, healthcare, politics, and sports. There are also updates on local news, weather, safety, and events in the Dallas area. Additionally, there are news reports about the Dallas Cowboys and other local sports teams. To read more about the latest news in Dallas, you can visit websites such as Dallas News, Dallas Observer, NBC 5 Dallas-Fort Worth, FOX 4 News Dallas-Fort Worth, and others.', role='assistant', function_call=None, tool_calls=None))], created=1699536001, model='gpt-3.5-turbo-1106', object='chat.completion', system_fingerprint='fp_eeff13170a', usage=CompletionUsage(completion_tokens=106, prompt_tokens=2464, total_tokens=2570))
        return second_response

    # Optional: second response from the model
    # formatted_answer = second_response.choices[0].message.content + " Total tokens used: " + str(second_response.usage.total_tokens)

RobertCraigie commented 11 months ago

Is this report for the example defined here? https://platform.openai.com/docs/guides/function-calling/parallel-function-calling

Or is it somewhere else? That example runs successfully without modification for me.

mikulskibartosz commented 11 months ago

Yes, it was this example.

enochcheung commented 11 months ago

@mikulskibartosz @unconv @cjpark-data @miwiley

Hi!

We're looking into this issue, but have not been able to reproduce this on our end. From your reports, we suspect that this is due to the first API request missing content in its response message, which would be a bug. This causes the second API request to fail since you are just adding the message from the API response of the first request. However, we have not been able to observe this behavior in our API either. Given that we are using the exact same snippet, this is very odd.

Is this something that occurs consistently on your end? If possible, we'd love to see the full output of the FIRST request as well. We appreciate your help in figuring this out!

unconv commented 11 months ago

@enochcheung @mikulskibartosz @cjpark-data @miwiley

I wasn't able to reproduce the problem when running in a virtual environment, so I checked the difference between the pip freeze output in the virtual environment and outside it.

Turns out pydantic==1.10.12 was the issue.

Running pip install --upgrade pydantic solves the problem.
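A rough simulation of the behaviour described above (this is not the SDK's real internals, just an illustration of the effect reported in this thread): under pydantic v1, None-valued fields like content and function_call leaked into the serialized request body, whereas excluding None fields yields a body the API accepts.

```python
# Fields of the assistant message after the first response, per the thread.
message_fields = {
    "role": "assistant",
    "content": None,
    "function_call": None,
    "tool_calls": [{"id": "call_abc", "type": "function"}],
}

# v1-style serialization: None values survive into the JSON body -> 400
v1_body = dict(message_fields)

# v2-style serialization: None fields excluded (akin to exclude_none=True)
v2_body = {k: v for k, v in message_fields.items() if v is not None}

print("content" in v1_body)   # True  (sent as null, rejected by the API)
print("content" in v2_body)   # False (omitted entirely)
```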

enochcheung commented 11 months ago

Interesting! Thank you for digging into this, this is super useful for us. We'll look into ways to try to be more resilient here.

miwiley commented 11 months ago

The failure, the success, and the pydantic upgrade, @unconv @enochcheung:

../env/lib/python3.9/site-packages

Name: openai Version: 1.2.0

Name: anyio Version: 3.7.1

Name: distro Version: 1.8.0

Name: httpx Version: 0.25.0

Name: pydantic Version: 1.10.12 <---BEFORE UPGRADE

Name: tqdm Version: 4.66.1

Name: typing_extensions Version: 4.8.0

CODE & LOGS

    import openai
    from openai import OpenAI
    import logging
    import json
    from duckpy import Client
    import re
    from bs4 import BeautifulSoup


FAIL =

    tool_calls = response_message.tool_calls
    print ("Tools being called", response_message.tool_calls)
    response_message = response.choices[0].message <---------

SUCCESS =

    response_message = dict(response.choices[0].message)    <--- convert to dict
    if response_message["function_call"] is None:    <---check
        del response_message["function_call"]    <--del

FAILURE [LOGS]

ask_gpt called latest news in maui <--my route/module call
User Input: latest news in maui <--incoming
Initial Messages: [{'role': 'user', 'content': 'latest news in maui'}]

[FIRST RESPONSE] Initial API response ChatCompletionMessage(content=None, role='assistant', function_call=None, tool_calls=[ChatCompletionMessageToolCall(id='call_UW87QII3EMQZMSwY26pQvK8L', function=Function(arguments='{"input":"latest news in Maui"}', name='get_online_data'), type='function')]) Total tokens used 103

[FUNCTION CALL] Tools being called [ChatCompletionMessageToolCall(id='call_UW87QII3EMQZMSwY26pQvK8L', function=Function(arguments='{"input":"latest news in Maui"}', name='get_online_data'), type='function')]

[ERROR] Error code: 400 - {'error': {'message': "'content' is a required property - 'messages.1'", 'type': 'invalid_request_error', 'param': None, 'code': None}}

SUCCESS [LOGS]

ask_gpt called latest news in maui <--my route/module call
User Input: latest news in maui <--incoming
Initial Messages: [{'role': 'user', 'content': 'latest news in maui'}]

[FIRST RESPONSE] Initial API response ChatCompletionMessage(content=None, role='assistant', function_call=None, tool_calls=[ChatCompletionMessageToolCall(id='call_PspTCyFADZx5Iiw3Qvr2Nxwo', function=Function(arguments='{"input":"latest news in Maui"}', name='get_online_data'), type='function')]) Total tokens used 103

[FUNCTION CALL] Tools being called [ChatCompletionMessageToolCall(id='call_PspTCyFADZx5Iiw3Qvr2Nxwo', function=Function(arguments='{"input":"latest news in Maui"}', name='get_online_data'), type='function')]

[SECOND RESPONSE] second_response ChatCompletion(id='chatcmpl-8JI6I5zb7GHBjnQzjYPWb7j7MjKcI', choices=[Choice(finish_reason='stop', index=0, message=ChatCompletionMessage(content="The latest news in Maui includes updates on the recovery efforts following the devastating wildfires, the announcement of a $150 million fund to help wildfire victims' families and survivors, and ongoing efforts to support affected residents. There are also reports on the impact of the wildfires on the community and the progress in the search for missing individuals. Additionally, there are updates on various relief efforts and initiatives aimed at supporting those affected by the wildfires.", role='assistant', function_call=None, tool_calls=None))], created=1699608502, model='gpt-3.5-turbo-1106', object='chat.completion', system_fingerprint='fp_eeff13170a', usage=CompletionUsage(completion_tokens=84, prompt_tokens=2884, total_tokens=2968))

WORKING AFTER PYDANTIC UPGRADE

pip3 install --upgrade pydantic
--- Successfully installed pydantic-2.4.2 pydantic-core-2.10.1 ---

Removed the dict/del workaround (reverted to the example method):

    response_message = response.choices[0].message
    tool_calls = response_message.tool_calls

[LOGS]
ask_gpt called latest news in maui
User Input: latest news in maui
Initial Messages: [{'role': 'user', 'content': 'latest news in maui'}]

[FIRST RESPONSE]

Initial API response ChatCompletionMessage(content=None, role='assistant', function_call=None, tool_calls=[ChatCompletionMessageToolCall(id='call_oC6CEoCRsAzaqroMgD8cLIQl', function=Function(arguments='{"input":"latest news in Maui"}', name='get_online_data'), type='function')]) Total tokens used 103

[FUNCTION CALL]

Tools being called [ChatCompletionMessageToolCall(id='call_oC6CEoCRsAzaqroMgD8cLIQl', function=Function(arguments='{"input":"latest news in Maui"}', name='get_online_data'), type='function')]

[SECOND RESPONSE]

second_response ChatCompletion(id='chatcmpl-8JIPORfrJXSmH3oHfTmFTasCF6AP4', choices=[Choice(finish_reason='stop', index=0, message=ChatCompletionMessage(content='Some of the latest news in Maui include updates on the recovery efforts following the wildfires, the creation of a fund to help wildfire victims and survivors, contracts awarded for temporary elementary schools, and ongoing updates about the impact and aftermath of the wildfires. For more detailed information, you can visit websites such as Maui Now, KHON2, and the Maui News.', role='assistant', function_call=None, tool_calls=None))], created=1699609686, model='gpt-3.5-turbo-1106', object='chat.completion', system_fingerprint='fp_eeff13170a', usage=CompletionUsage(completion_tokens=71, prompt_tokens=2908, total_tokens=2979))

RobertCraigie commented 11 months ago

Thanks for the details and figuring out the cause was Pydantic v1! I've identified the difference in behaviour internally and should have a fix out later today.

RobertCraigie commented 11 months ago

This will be fixed in the next release! https://github.com/openai/openai-python/pull/776/files

roger-zr-wang commented 11 months ago

@enochcheung @mikulskibartosz @cjpark-data @miwiley

> I wasn't able to reproduce the problem when running in a virtual environment, so I checked the difference between the pip freeze output in the virtual environment and outside it.
>
> Turns out pydantic==1.10.12 was the issue.
>
> Running pip install --upgrade pydantic solves the problem.

This is exactly the solution for me! 🚀 🔥

rattrayalex commented 11 months ago

Fixed in https://github.com/openai/openai-python/releases/tag/v1.2.3

(upgrading pydantic also works)
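For anyone landing here later, a quick hedged sketch for checking whether the installed versions avoid this bug (per the thread, openai >= 1.2.3 or pydantic >= 2 is sufficient); the helper function is hypothetical, not part of any SDK:

```python
# Report the installed versions of the two packages discussed in this
# thread; handles the case where a package is not installed at all.
import importlib.metadata as md

def major(version: str) -> int:
    """Extract the major version number from an 'X.Y.Z' string."""
    return int(version.split(".")[0])

for pkg in ("openai", "pydantic"):
    try:
        print(pkg, md.version(pkg))
    except md.PackageNotFoundError:
        print(pkg, "not installed")

print(major("2.4.2"))  # 2
```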

miwiley commented 11 months ago

> Fixed in https://github.com/openai/openai-python/releases/tag/v1.2.3
>
> (upgrading pydantic also works)

Outstanding! Cheers.