Currently, if you call `llm.chat` with tools and streaming, the tool calls don't show up in the raw response. This change adds the `tool_calls` to `response_from_chunks`, so the final response when streaming is nearly the same as when not streaming.
With this solution:

```ruby
response = llm.chat(messages:, tools:) { |chunk| do_something(chunk) }
response.tool_calls # returns the tool_calls instead of an empty array
```
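A minimal sketch of the idea: collect `tool_calls` while assembling the final response from streamed chunks, instead of dropping them. The chunk shape (a Hash with optional `:content` and `:tool_calls` keys) and the `ToolCall` struct are assumptions for illustration, not the library's actual internals.

```ruby
# Hypothetical ToolCall shape for illustration only.
ToolCall = Struct.new(:id, :name, :arguments)

# Sketch of response_from_chunks: join streamed text content and,
# crucially, accumulate tool_calls across every chunk so the final
# response exposes them just like the non-streaming path would.
def response_from_chunks(chunks)
  content    = chunks.map { |c| c[:content].to_s }.join
  tool_calls = chunks.flat_map { |c| c[:tool_calls] || [] }
  { content: content, tool_calls: tool_calls }
end

chunks = [
  { content: "Checking the weather. " },
  { tool_calls: [ToolCall.new("call_1", "get_weather", '{"city":"Paris"}')] }
]

response = response_from_chunks(chunks)
response[:tool_calls] # no longer an empty array
```

The key line is the `flat_map`: without it, only the text content survives chunk aggregation and `tool_calls` comes back empty.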