openai-php / client

⚡️ OpenAI PHP is a supercharged community-maintained PHP API client that allows you to interact with the OpenAI API.

Missing content if it consists only of zero #169

Closed: websolutionfalcon closed this issue 1 year ago

websolutionfalcon commented 1 year ago

Hello! We use this library to get streamed responses from OpenAI, and we found that sometimes streamed chunks are missing their content.

I asked the model "multiply 100 on 10", and the streamed chunks were:

  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"role":"assistant"},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"content":"To"},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"content":" multiply"},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"content":" "},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"content":"100"},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"content":" by"},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"content":" "},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"content":"10"},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"content":","},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"content":" you"},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"content":" simply"},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"content":" multiply"},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"content":" the"},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"content":" two"},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"content":" numbers"},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"content":" together"},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"content":":

"},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"content":"100"},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"content":" *"},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"content":" "},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"content":"10"},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"content":" ="},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"content":" "},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"content":"100"},"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":[],"finish_reason":null}]} 
  {"id":"chatcmpl-7dyU298SlZywQtM5bhjXMlPJvd3Bx","object":"chat.completion.chunk","created":1689761646,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":[],"finish_reason":"stop"}]} 

As you can see, the last chunk before the finish should contain only the symbol 0, but its delta arrives empty instead. I guess the problem is that PHP evaluates the string "0" as false, and that is why the content goes missing. If the last chunk is "00" or "000", everything works fine.
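That truthiness guess is right in principle: in PHP, "0" is the only non-empty string that evaluates to false, so a naive truthiness check drops exactly this chunk while "00" and "000" pass. A minimal standalone sketch of the pitfall (illustrative only, not the library's actual code):

$chunks = ['100', ' *', ' 10', ' =', ' 100', '0'];

$text = '';
foreach ($chunks as $content) {
    if ($content) {          // buggy: the string "0" is falsy, so it is skipped
        $text .= $content;
    }
}
echo $text;                  // "100 * 10 = 100" -- the trailing 0 is lost

$text = '';
foreach ($chunks as $content) {
    if ($content !== null && $content !== '') {   // strict check keeps "0"
        $text .= $content;
    }
}
echo $text;                  // "100 * 10 = 1000"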

My code:

$client = OpenAI::client(config('openapi.key'));

return response()->stream(function () use ($client, $request, $messages) {
    $stream = $client->chat()->createStreamed([
        'model' => $request->input('model', 'gpt-3.5-turbo'),
        'temperature' => 0.7,
        'max_tokens' => $request->input('max_tokens', 512),
        'top_p' => 1,
        'messages' => $messages,
        'stop' => $request->input('stop', null),
        'n' => 1,
        'user' => $request->input('user', ''),
    ]);

    // Forward each chunk to the browser as a server-sent event.
    foreach ($stream as $response) {
        $text = json_encode($response->toArray());
        \Log::info('streaming: ', $response->toArray());
        if (connection_aborted()) {
            break;
        }

        echo "event: update\n";
        echo 'data: ' . $text;
        echo "\n\n";
        ob_flush();
        flush();
    }

    // Sentinel event so the client knows the stream has ended.
    echo "event: update\n";
    echo 'data: <END_STREAMING_SSE>';
    echo "\n\n";
    ob_flush();
    flush();
}, 200, [
    'Cache-Control' => 'no-cache',
    'X-Accel-Buffering' => 'no',
    'Content-Type' => 'text/event-stream',
]);
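Note that the same pitfall can bite in user code too: if you accumulate the streamed text with a truthiness check, a lone "0" chunk disappears again even with the library fixed. A safer accumulation pattern, assuming the response shape documented in the library's README (delta->content is a nullable string):

$full = '';
foreach ($stream as $response) {
    // ?? '' keeps falsy-but-valid strings such as "0";
    // only genuinely absent content (null) is replaced.
    $full .= $response->choices[0]->delta->content ?? '';
}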
pb30 commented 1 year ago

Are you on the latest version (0.6.3)? I believe this was fixed there.
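For anyone landing here later: if the package is pinned to an older release, updating is a one-liner (assuming Composer manages the dependency):

composer require "openai-php/client:^0.6.3"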

websolutionfalcon commented 1 year ago

I was on 0.4.1. Thanks, the update fixed it.