PrefectHQ / marvin

✨ Build AI interfaces that spark joy
https://askmarvin.ai
Apache License 2.0

Allow passing of `response_format` to Assistant constructor #957

Closed · andehr closed this 4 months ago

andehr commented 4 months ago

Fixes https://github.com/PrefectHQ/marvin/issues/956

As proposed in that issue, this PR enables the following kinds of usage:

Assistant(instructions=some_instructions, response_format=JsonObjectAssistantResponseFormat())

or this:

Assistant(instructions=some_instructions, response_format=AssistantResponseFormat(type="json_object"))

Be sure that your instructions include a careful description of the JSON schema you expect the LLM to generate. Also note that the API means it when it says "json_object": this doesn't merely constrain the response to valid JSON, it specifically ensures the model returns a single JSON object. So if you want, say, a JSON array, the array has to be a property of a wrapper object.
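To make that concrete, here's a minimal sketch. The import paths and the stand-in reply text are assumptions (this PR is what introduces the response-format classes, so adjust imports to wherever they end up); the point is just the schema-describing instructions and the array wrapped in an object:

```python
import json

# Assumed import paths; this PR adds the response-format classes, so
# these may live elsewhere in the released package.
from marvin.beta.assistants import Assistant
from marvin.beta.assistants import JsonObjectAssistantResponseFormat  # hypothetical path

# Describe the expected schema in the instructions, and wrap the array in
# an object, since "json_object" mode returns a single JSON object.
instructions = (
    "You extract city names from text. "
    'Always respond with a single JSON object of the form {"cities": ["<name>", ...]}.'
)

assistant = Assistant(
    instructions=instructions,
    response_format=JsonObjectAssistantResponseFormat(),
)

# Stand-in for the model's reply text, however you retrieve it: it should
# parse as one JSON object with the array under the "cities" key.
reply_text = '{"cities": ["Paris", "Lima"]}'
cities = json.loads(reply_text)["cities"]
print(cities)
```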

Here's the relevant OpenAI endpoint doc: https://platform.openai.com/docs/api-reference/assistants