jxnl / instructor

structured outputs for llms
https://python.useinstructor.com/
MIT License

Integrate with new OpenAI json_schema params #909

Open asim-shrestha opened 1 month ago

asim-shrestha commented 1 month ago

Is your feature request related to a problem? Please describe.
OpenAI just released stricter structured output adherence: https://openai.com/index/introducing-structured-outputs-in-the-api/.

Taking a look at the instructor client code, it seems we're already passing in strict=True by default, but it would be good to also support the new json_schema option for the response_format parameter on the gpt-4o models that support it.

Relevant quote describing json_schema from the blog post:

  1. A new option for the response_format parameter: developers can now supply a JSON Schema via json_schema, a new option for the response_format parameter. This is useful when the model is not calling a tool, but rather, responding to the user in a structured way. This feature works with our newest gpt-4o models: gpt-4o-2024-08-06, released today, and gpt-4o-mini-2024-07-18. When a response_format is supplied with strict: true, model outputs will match the supplied schema.
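For reference, here is a rough sketch (not instructor code) of the raw request shape the new response_format option expects, based on the quote above and OpenAI's announcement. The schema name, model string, and messages are illustrative placeholders.

```python
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-2024-08-06",
    messages=[{"role": "user", "content": "Extract: Jason is 25 years old."}],
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": "user_info",  # arbitrary schema name
            "strict": True,       # enforce exact schema adherence
            "schema": {
                "type": "object",
                "properties": {
                    "name": {"type": "string"},
                    "age": {"type": "integer"},
                },
                "required": ["name", "age"],
                "additionalProperties": False,
            },
        },
    },
)

# The message content is a JSON string guaranteed to match the schema
print(response.choices[0].message.content)
```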
AlmogBaku commented 1 month ago

That was fast!

It looks like OpenAI took a lot of inspiration from here 😬

mrdkucher commented 1 week ago

+1. Looks like support was added for passing the strict=True argument on tool calls (https://github.com/jxnl/instructor/pull/938), but will there be a corresponding Mode.JSON_STRICT update using the new "json_schema" support in OpenAI's response_format parameter? It would be nice to get guaranteed structured output there as well, not just in tool calls.
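A hypothetical sketch of what that could look like from the caller's side, reusing instructor's existing from_openai / response_model interface. Mode.JSON_STRICT is just the name proposed above and does not exist yet; everything else follows the current instructor usage pattern.

```python
import instructor
from openai import OpenAI
from pydantic import BaseModel


class UserInfo(BaseModel):
    name: str
    age: int


# Hypothetical: patch the client so response_model requests go through the new
# response_format={"type": "json_schema", ...} path instead of tool calls.
client = instructor.from_openai(OpenAI(), mode=instructor.Mode.JSON_STRICT)

user = client.chat.completions.create(
    model="gpt-4o-2024-08-06",
    response_model=UserInfo,
    messages=[{"role": "user", "content": "Jason is 25 years old."}],
)
print(user)  # UserInfo(name='Jason', age=25)
```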