Ali-Kalout opened this issue 2 months ago
Just to expand on this a little (I work with @Ali-Kalout :) ). It's not easy to tell what the malformed JSON looks like, but it seems possible that this PR could improve the situation.
I also noticed yesterday that OpenAI has added a JSON mode to chat completions. Adding `response_format` to the completion API call seems like it could help a lot here.
If I'm not mistaken, whatever params in my `chat_completion` call aren't Instructor params get passed through to the API request. Is that right? So the following should work?
```elixir
Instructor.chat_completion(
  model: @model,
  response_model: response_model,
  messages: messages,
  stream: stream,
  max_retries: @max_retries,
  # the important part below
  response_format: %{type: "json_object"}
)
```
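One caveat worth noting, in case it bites anyone trying this: per OpenAI's documentation for JSON mode (this is API behavior, not anything Instructor-specific), when `response_format` is `%{type: "json_object"}` the request fails unless the word "JSON" appears somewhere in the messages, usually in the system prompt. A minimal sketch of a messages list that satisfies that requirement (the actual prompt content here is hypothetical):

```elixir
# JSON mode requires the word "JSON" to appear in the conversation,
# typically in the system message, or the API returns an error.
messages = [
  %{role: "system", content: "You are a helpful assistant. Respond only in JSON."},
  %{role: "user", content: "Summarize the report below."}
]
```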
I have this PR that helps with debugging: https://github.com/thmsmlr/instructor_ex/pull/46. If you `IO.inspect` the request before it goes to the LLM, you should be able to see if the param is included in the way you'd expect.
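Even without that PR, a quick sanity check is to inspect the options right before handing them to Instructor. This is plain Elixir (`IO.inspect/2` returns its argument, so it splices into a pipeline), and the opts shown are just the ones from the snippet above:

```elixir
# Collect the options that will be forwarded to the completion call,
# print them with a label, then pass them straight through.
opts = [
  model: @model,
  response_model: response_model,
  messages: messages,
  stream: stream,
  max_retries: @max_retries,
  response_format: %{type: "json_object"}
]

opts
|> IO.inspect(label: "instructor params")
|> Instructor.chat_completion()
```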
Hi all, we sometimes get the following error when using the chat completion feature with the `stream` field set to `true`.