Closed: MadBomber closed this issue 4 months ago
I was able to get the request to complete the round trip by adding the word "JSON" in front of the word "assistant" in the content component of the messages object, just as the error response suggested.
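Roughly, the change looked like this (the "before" string is my recollection of the README example, so it may not be word-for-word):

# Before: the API rejected the request, saying the word "JSON" had to appear in the messages.
# { role: OmniAI::Chat::Role::SYSTEM, content: 'You are a helpful assistant.' }

# After: adding "JSON" in front of "assistant" satisfied the check.
{ role: OmniAI::Chat::Role::SYSTEM, content: 'You are a helpful JSON assistant.' }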
I think your intent with the format: parameter was to look up the value to get a text string to insert somewhere within the prompt text. It does not look like that is happening.
@MadBomber thanks for the submission. The documentation for the format argument includes a note on JSON usage:
https://github.com/ksylvest/omniai-openai?tab=readme-ov-file#format
Specifically, it contains:
completion = client.chat([
  { role: OmniAI::Chat::Role::SYSTEM, content: OmniAI::Chat::JSON_PROMPT },
  { role: OmniAI::Chat::Role::USER, content: 'What is the name of the drummer for the Beatles?' }
], format: :json)

JSON.parse(completion.choice.message.content) # { "name": "Ringo" }
When using JSON mode, you must also instruct the model to produce JSON yourself via a system or user message.
Any time you're generating JSON, I'd suggest either using the built-in constant prompt or using a prompt that explicitly calls out JSON (it sounds like you wound up using the latter). That constant simply includes the following text as a system message:
"Respond with valid JSON. Do not include any non-JSON in the response."
To be clear, the intent of format isn't to do a lookup. It simply passes response_format through to the API as json_object.
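Concretely, that means the request sent to OpenAI ends up carrying a payload along these lines (a sketch of the shape only; the model name is a placeholder):

payload = {
  model: 'gpt-4o',                           # placeholder
  response_format: { type: 'json_object' },  # what format: :json maps to
  messages: [
    { role: 'system', content: 'Respond with valid JSON. Do not include any non-JSON in the response.' },
    { role: 'user', content: 'What is the name of the drummer for the Beatles?' }
  ]
}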
If you feel like the documentation for any of this can be cleaned up, I'm happy to accept a PR, but I'm going to close this for now since it sounds like the behaviour is as expected if the example didn't include the JSON_PROMPT.
I got the example message layout from the README file of the omniai gem.

Running Ruby v3.4.0-preview1 on macOS.

omniai (1.3.1)
omniai-anthropic (1.3.0)
omniai-google (1.3.0)
omniai-mistral (1.3.0)
omniai-openai (1.3.3)