thmsmlr / instructor_ex

Structured outputs for LLMs in Elixir
https://hexdocs.pm/instructor

Error when using streaming chat completion #44

Open Ali-Kalout opened 2 months ago

Ali-Kalout commented 2 months ago

Hi all, we sometimes get the following error when using the chat completion feature with the stream option set to true.

(screenshot of the error, taken 2024-04-09)
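
For context, the call we're making looks roughly like the sketch below (the schema, model, and messages are simplified placeholders, not our actual code). It follows the streaming pattern from the Instructor docs, where stream: true is combined with an array response model:

    defmodule Prediction do
      use Ecto.Schema

      @primary_key false
      embedded_schema do
        field(:label, :string)
      end
    end

    Instructor.chat_completion(
      model: "gpt-3.5-turbo",
      response_model: {:array, Prediction},
      stream: true,
      messages: [
        %{role: "user", content: "Classify the following inputs ..."}
      ]
    )
    # Each element of the stream should be {:ok, struct} or {:error, changeset}.
    |> Stream.each(fn
      {:ok, prediction} -> IO.inspect(prediction)
      {:error, changeset} -> IO.inspect(changeset)
    end)
    |> Stream.run()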

adampash commented 2 months ago

Just to expand on this a little (I work with @Ali-Kalout :) ): it's not easy to tell what the malformed JSON looks like, but it seems possible that this PR could improve the situation.

I also noticed yesterday that OpenAI has added a JSON mode to chat completions. Adding response_format to the completion API call seems like it could help a lot here.

adampash commented 2 months ago

If I'm not mistaken, any params passed to chat_completion that aren't Instructor-specific get passed through to the API request. Is that right? If so, the following should work:

    Instructor.chat_completion(
      model: @model,
      response_model: response_model,
      messages: messages,
      stream: stream,
      max_retries: @max_retries,
      # the important part below
      response_format: %{type: "json_object"}
    )

petrus-jvrensburg commented 2 months ago

I have this PR that helps with debugging: https://github.com/thmsmlr/instructor_ex/pull/46. If you IO.inspect the request before it goes to the LLM, you should be able to see if the param is included in the way you'd expect.
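
For example (a rough sketch, reusing the params from the snippet above), you can dump the options right before handing them to Instructor:

    # Log the exact options passed to Instructor before making the call.
    # IO.inspect returns its argument, so it can sit inline in the pipeline.
    params = [
      model: @model,
      response_model: response_model,
      messages: messages,
      stream: stream,
      max_retries: @max_retries,
      response_format: %{type: "json_object"}
    ]

    params
    |> IO.inspect(label: "chat_completion params")
    |> Instructor.chat_completion()

Note that this only shows what you pass in; the PR is what shows the actual outbound request after Instructor has assembled it.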