openai / openai-python

The official Python library for the OpenAI API
https://pypi.org/project/openai/
Apache License 2.0

json_schema structured output type not supported in gpt-4o assistants #1857

Open majr-red opened 2 weeks ago

majr-red commented 2 weeks ago

Confirm this is an issue with the Python library and not an underlying OpenAI API

Describe the bug

Not sure if this is a bug in the docs or the code. The API reference for the response_format parameter of assistants.create says:

Specifies the format that the model must output. Compatible with GPT-4o, GPT-4 Turbo, and all GPT-3.5 Turbo models since gpt-3.5-turbo-1106.

Setting to { "type": "json_schema", "json_schema": {...} } enables Structured Outputs which ensures the model will match your supplied JSON schema.

But I am seeing this error message:

"Invalid parameter: 'response_format' of type 'json_schema' is not supported with model version gpt-4o."

>>> from pydantic import BaseModel
>>> 
>>> class Project_Spending(BaseModel):
...     Authority_Name: str
...     Project_Count: int
...     Total_spending: int
... 
>>> client.beta.assistants.create(
...     name = "prjdata",
...     tools = [{'type':'file_search'}],
...     model = "gpt-4o",
...     description = "find the required data from the attachment",
...     response_format={
...         'type': 'json_schema',
...         'json_schema':
...         {
...             "name": "Project_Spending",
...             "schema": Project_Spending.model_json_schema()
...         }
...     }
... )
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/homebrew/anaconda3/envs/apt-research/lib/python3.12/site-packages/openai/resources/beta/assistants.py", line 146, in create
    return self._post(
           ^^^^^^^^^^^
  File "/opt/homebrew/anaconda3/envs/apt-research/lib/python3.12/site-packages/openai/_base_client.py", line 1277, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/anaconda3/envs/apt-research/lib/python3.12/site-packages/openai/_base_client.py", line 954, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/opt/homebrew/anaconda3/envs/apt-research/lib/python3.12/site-packages/openai/_base_client.py", line 1058, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': "Invalid parameter: 'response_format' of type 'json_schema' is not supported with model version `gpt-4o`.", 'type': 'invalid_request_error', 'param': 'response_format', 'code': None}}
>>> 
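For reference, the payload being sent can be reconstructed offline without calling the API (a sketch on my part; the strict: True flag and additionalProperties: False are additions the Structured Outputs docs say a strict schema needs, not something my original call included):

```python
from pydantic import BaseModel


class Project_Spending(BaseModel):
    Authority_Name: str
    Project_Count: int
    Total_spending: int


# Strict Structured Outputs require every property to be listed in
# `required` and `additionalProperties` to be false on the schema.
schema = Project_Spending.model_json_schema()
schema["additionalProperties"] = False

response_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "Project_Spending",
        "strict": True,
        "schema": schema,
    },
}

print(response_format["type"])  # "json_schema"
```

This builds the same dict as the failing call above, so the 400 error is about the model/endpoint combination, not the shape of the payload.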

To Reproduce

Call OpenAI().beta.assistants.create() with a response_format of type json_schema, as in the snippet above

Code snippets

No response

OS

macOS

Python version

Python 3.12.7

Library version

openai v1.52.2

majr-red commented 1 week ago

After some more investigation, I find I get the same error from gpt-3.5-turbo-1106.

I also find that although the models documentation says that gpt-4o points to gpt-4o-2024-08-06, if I point the assistant in the above code explicitly to gpt-4o-2024-08-06 (i.e. model="gpt-4o-2024-08-06"), I get a different error:

BadRequestError: Error code: 400 - {'error': {'message': 'Invalid tools: all tools must be of type `function` when `response_format` is of type `json_schema`.', ...
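If I read that second error correctly, json_schema output is only accepted on this snapshot when every tool is of type function, so file_search would have to be dropped. A sketch (untested assumption on my part, built offline as a plain dict) of what the adjusted call's arguments would look like:

```python
# Assumption based on the error message: pin the dated snapshot and
# remove the `file_search` tool, since all tools must be of type
# `function` when `response_format` is of type `json_schema`.
create_kwargs = {
    "name": "prjdata",
    "model": "gpt-4o-2024-08-06",  # dated snapshot, not the alias
    "tools": [],  # `file_search` removed; only `function` tools allowed
    "description": "find the required data from the attachment",
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "Project_Spending",
            "schema": {"type": "object"},  # placeholder for the real schema
        },
    },
}

# The constraint stated by the error, expressed as a check:
assert all(t.get("type") == "function" for t in create_kwargs["tools"])
```

Of course, dropping file_search defeats the point of my assistant, which needs to read the attachment.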

Is it possible that in the Python library, gpt-4o does not actually point to gpt-4o-2024-08-06?