Closed devpulse01 closed 7 months ago
Hi @devpulse01.
There was a bug in the code when calling tools: OpenAI returns the details of the target tool/function across multiple stream events, whereas Deep Chat expected them in a single event. As a result it did not know which function to call, and the stream could not process anything.
I have now made a fix for this. Deep Chat now aggregates the tool/function details and calls the OpenAI API correctly to get the final result. However, I could not get that final result to be streamed, because the existing codebase is not set up to facilitate stream-within-stream functionality; instead it makes a normal HTTP request and simulates a stream-like response from the full result. Average users should not notice a difference, but I wanted to be clear about what the fix does. Other non-tool chat responses will still be streamed.
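The simulation step described above might look roughly like the following. This is an illustrative sketch, not Deep Chat's actual implementation; the `simulateStream` name, the `emit` callback, and the `wordsPerChunk` parameter are all assumed for the example:

```typescript
// Illustrative sketch: after the tool result has been fetched with a plain
// HTTP request, replay the full text in small pieces so the UI still
// appears to stream. Names here are assumed, not Deep Chat internals.
function simulateStream(
  fullText: string,
  emit: (piece: string) => void,
  wordsPerChunk = 3
): void {
  const words = fullText.split(' ');
  for (let i = 0; i < words.length; i += wordsPerChunk) {
    const piece = words.slice(i, i + wordsPerChunk).join(' ');
    // Re-add the separator between chunks so the pieces concatenate cleanly.
    emit(piece + (i + wordsPerChunk < words.length ? ' ' : ''));
  }
}
```

Emitting each piece on a short timer would make it visually indistinguishable from a real token stream for most responses.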
This fix is available in the deep-chat-dev and deep-chat-react-dev packages, version 9.0.129. Let me know if this works for you.
Yes, it works, thank you!
This has now been released in Deep Chat version 1.4.11. Check out the release notes for more details.
Hi Ovidijus,
I've encountered an issue when working with tool_calls in OpenAI Chat with streaming enabled.
Because the tool_calls arguments arrive split across message chunks, I do not return a response immediately from the onResponse interceptor. Instead, I accumulate the tool call arguments and return only the complete message.
Example:
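A minimal sketch of this accumulation pattern, assuming OpenAI's streaming delta shape (`delta.tool_calls` fragments keyed by `index`, with `finish_reason: "tool_calls"` at the end). The type names and overall structure are illustrative, not Deep Chat's actual types:

```typescript
// Assumed shapes for the example, modeled on OpenAI's streaming deltas.
interface ToolCallDelta {
  index: number;
  id?: string;
  function?: { name?: string; arguments?: string };
}

interface StreamChunk {
  choices: {
    delta: { content?: string; tool_calls?: ToolCallDelta[] };
    finish_reason: string | null;
  }[];
}

// Partially assembled tool calls, keyed by the delta's `index` field.
const pending: { id?: string; name?: string; arguments: string }[] = [];

function onResponse(chunk: StreamChunk) {
  const choice = chunk.choices[0];
  for (const tc of choice.delta.tool_calls ?? []) {
    pending[tc.index] ??= { arguments: '' };
    if (tc.id) pending[tc.index].id = tc.id;
    if (tc.function?.name) pending[tc.index].name = tc.function.name;
    // Argument fragments are plain string pieces of a JSON document.
    pending[tc.index].arguments += tc.function?.arguments ?? '';
  }
  // Only when the model signals the end of the tool calls is the fully
  // assembled message returned; intermediate chunks are suppressed.
  if (choice.finish_reason === 'tool_calls') {
    return {
      tool_calls: pending.map((p) => ({
        id: p.id,
        type: 'function',
        function: { name: p.name, arguments: p.arguments },
      })),
    };
  }
  return {};
}
```

The suppressed intermediate returns are what appear to trigger the error below, since the handler receives chunks without the fields it expects.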
However, I'm facing an error from services/openAI/openAIChatIO.ts within the handleTools method:
In my case, fetchFunc and prevBody are empty.
When I return the responses without any modification (allowing the chunked tool call data to pass through), no error is raised, but unfortunately the tool calls are not executed as expected.
Is there a recommended workaround or solution for handling streamed tool_calls in this scenario? Thank you.