Fixes https://github.com/PrefectHQ/marvin/issues/956
Per the example usage proposed in that issue, this PR enables this kind of usage:
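(Sketch of the intended shape only: passing the Assistants API `response_format` through the `Assistant` constructor is an assumption about the final API, not the verbatim snippet from the issue.)

```python
from marvin.beta.assistants import Assistant

# Sketch: the response_format keyword on the Assistant constructor is an
# assumption about where the new option is accepted.
assistant = Assistant(
    name="json-writer",
    instructions=(
        "Respond with a single JSON object matching this schema: "
        '{"title": <string>, "tags": [<string>, ...]}'
    ),
    response_format={"type": "json_object"},
)

assistant.say("Summarize the plot of Hamlet.")
```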
or this:
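(Again a sketch: supplying `response_format` per run rather than per assistant; the exact keyword placement on `Thread.run` is an assumption.)

```python
from marvin.beta.assistants import Assistant, Thread

assistant = Assistant(name="json-writer", instructions="Always answer in JSON.")

thread = Thread()
thread.add("List three primes as a JSON object with a 'primes' array.")

# Sketch: overriding response_format for a single run is an assumption about
# where the new keyword is accepted.
thread.run(assistant, response_format={"type": "json_object"})
```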
Be sure that your instructions include a careful description of the JSON schema you expect the LLM to generate. Also note that the API means it when it says "json_object": this doesn't constrain the response to arbitrary JSON; it specifically ensures the model returns a single JSON object. So if you want, for example, a JSON array, the array will have to be a property of a wrapper object.
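For example, if you want a list back, describe a wrapper object in the instructions rather than a bare array (illustrative only):

```python
# The model must be told to wrap the array, since json_object guarantees a
# single top-level JSON object, not a top-level array.
instructions = (
    "Respond with a single JSON object of the form "
    '{"tags": ["<tag>", "<tag>", ...]}; never return a bare JSON array.'
)
```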
Here's the relevant OpenAI endpoint doc: https://platform.openai.com/docs/api-reference/assistants