Closed: jinhuashijie closed this issue 1 year ago
Streaming should use:

```php
foreach ($stream as $response) {
    echo $response->choices[0]->delta->content;
}
```
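If the output still arrives all at once, the usual culprit is output buffering between PHP and the client. A minimal sketch that flushes each chunk as it arrives — the `text/event-stream` header and the `X-Accel-Buffering` header are illustrative choices for a typical browser + nginx setup, not requirements of the library:

```php
// Illustrative sketch: push each streamed chunk to the client immediately.
header('Content-Type: text/event-stream');
header('X-Accel-Buffering: no'); // hint to nginx not to buffer this response

// Close any output buffers PHP opened, so echo reaches the SAPI directly.
while (ob_get_level() > 0) {
    ob_end_flush();
}

foreach ($stream as $response) {
    echo $response->choices[0]->delta->content;
    flush(); // hand the chunk to the web server right away
}
```

Without the `flush()` call, many SAPI/web-server combinations hold the output until the script finishes, which looks exactly like a non-streaming response.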
You also need to ensure that your PHP and web-server configuration allow unbuffered output.
See this thread for more information: https://github.com/openai-php/client/issues/100
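For reference, these are the settings that most commonly interfere with streaming on a PHP-FPM + nginx stack. The values and the `location` path below are assumptions for illustration; adjust them to your own setup:

```ini
; php.ini — disable PHP-level output buffering and compression
output_buffering = Off
zlib.output_compression = Off
```

```nginx
# nginx — disable FastCGI buffering for the streaming endpoint
# (the /stream.php path is hypothetical; use your actual route)
location /stream.php {
    fastcgi_buffering off;
    gzip off;
}
```

With Apache and `mod_deflate`, disabling compression for the streaming route has a similar effect.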
When I use streaming, my code looks like this:

```php
$stream = $client->completions()->createStreamed([
    'model' => 'text-davinci-003',
    'prompt' => $prompt,
    'max_tokens' => 1000,
]);

foreach ($stream as $response) {
    echo $response->choices[0]->text;
    // var_dump($response);
}
```

I have to wait at least 12 seconds before any output appears. Why is this happening? Where is my error?