OvidijusParsiunas / deep-chat

Fully customizable AI chatbot component for your website
https://deepchat.dev
MIT License
1.43k stars 218 forks

Issue with tool_calls in OpenAI Chat when stream is enabled #116

Closed · devpulse01 closed this 7 months ago

devpulse01 commented 7 months ago

Hi Ovidijus,

I've encountered an issue when working with tool_calls in OpenAI Chat with streaming enabled.

Since tool_calls arguments arrive split across message chunks, I do not immediately return a response in the onResponse interceptor. Instead, I accumulate the tool call arguments and return only a complete message.

Example:

{
    "id": "chatcmpl-1111111111111111111111",
    "object": "chat.completion",
    "created": 1707220633,
    "model": "gpt-4-0125-preview",
    "system_fingerprint": "fp_123456789",
    "choices": [
        {
            "index": 0,
            "logprobs": null,
            "finish_reason": "tool_calls",
            "message": {
                "role": "assistant",
                "content": null,
                "tool_calls": [
                    {
                        "id": "call_11111111111111111111111",
                        "type": "function",
                        "function": {
                            "name": "get_current_weather",
                            "arguments": "{\"location\":\"New York, France\",\"unit\":\"celsius\"}"
                        }
                    }
                ]
            }
        }
    ]
}
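The accumulation step described above can be sketched like this. This is a minimal illustration, not Deep Chat's or my actual interceptor code; the delta shape and the `accumulateToolCalls` helper are assumptions based on how the OpenAI streaming API fragments tool call arguments:

```typescript
// Assumed shape of a streamed tool_call delta from an OpenAI chat chunk.
interface ToolCallDelta {
  index: number;
  id?: string;
  function?: {name?: string; arguments?: string};
}

interface AccumulatedCall {
  id: string;
  name: string;
  arguments: string;
}

// Merge a sequence of deltas into complete tool calls: the id and name
// arrive in the first delta per index, and the JSON `arguments` string
// arrives in fragments that must be concatenated in order.
function accumulateToolCalls(deltas: ToolCallDelta[]): AccumulatedCall[] {
  const calls: AccumulatedCall[] = [];
  for (const delta of deltas) {
    if (!calls[delta.index]) {
      calls[delta.index] = {id: delta.id ?? '', name: delta.function?.name ?? '', arguments: ''};
    }
    if (delta.function?.arguments) calls[delta.index].arguments += delta.function.arguments;
  }
  return calls;
}
```

Only once the stream signals `finish_reason: "tool_calls"` would the accumulated result be assembled into a complete message like the one shown above.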

However, I'm facing an error from services/openAI/openAIChatIO.ts within the handleTools method:

private async handleTools(message: OpenAIMessage, fetchFunc?: FetchFunc, prevBody?: OpenAIChat): Promise<ResponseT> {
  if (!message.tool_calls || !fetchFunc || !prevBody || !this._functionHandler) {
    throw Error(
      'Please define the `function_handler` property inside' +
      ' the [openAI](https://deepchat.dev/docs/directConnection/openAI#Chat) object.'
    );
  }
  // ... rest of the method elided
}

In my case, fetchFunc and prevBody are empty.

When I return the responses unmodified (letting the chunked tool call data pass through), no error is raised, but unfortunately the tool calls are not executed as expected.

Is there a recommended workaround or solution for handling streamed tool_calls in such a way? Thank you.

OvidijusParsiunas commented 7 months ago

Hi @devpulse01.

There was a bug in the tool-calling code: OpenAI returns the details of the target tool/function across multiple events, but Deep Chat expected them in a single event, so it did not know which function to call and the stream could not process anything.

I have now made a fix for this. Deep Chat now aggregates the tool/function details and calls the OpenAI API correctly to get the final result. One caveat: I could not get the final result to be streamed, as the existing codebase is not set up for stream-within-stream functionality. Instead, it makes a normal HTTP request and simulates a stream-like delivery of the full response. Average users should not notice a difference, but I wanted to be clear about what the fix does. Other non-tool chat responses will still be streamed.
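The "simulate the full response to be stream-like" part could look roughly like this. A hypothetical sketch only, not the actual Deep Chat implementation; the function name, word-based chunking, and delay parameter are all assumptions:

```typescript
// Replay a complete text response as small chunks so the UI renders it
// as if it were arriving from a live stream.
async function simulateStream(
  fullText: string,
  onChunk: (chunk: string) => void,
  delayMs = 0
): Promise<void> {
  // Split after each space so the chunks rejoin into the exact original text.
  const words = fullText.split(/(?<= )/);
  for (const word of words) {
    onChunk(word);
    if (delayMs > 0) await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
}
```

With a small non-zero delay, each chunk appears in the chat a beat after the previous one, which is why average users would not notice the difference from a real stream.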

This fix is available in deep-chat-dev and deep-chat-react-dev packages version 9.0.129. Let me know if this works for you.

devpulse01 commented 7 months ago

Yes, it works, thank you!

OvidijusParsiunas commented 7 months ago

This has now been released in Deep Chat version 1.4.11. Check out the release notes for more.