Hi
There is currently no stream support, but I am working on it right now and will let you know as soon as it is available to test.
Thank you so much for adding this. Much cleaner than what I had before. But how do you stream back the response? Is it the same server-sent events idea with echo, etc., or is there a cleaner way through Laravel?
The reason I ask is that it's not streaming for me: it kind of waits until it's almost done, then dumps everything out at once. Locally it works great, but once I push it up to production (Vapor), it doesn't behave the same.
@jhull Here is an example:
$client = OpenAI::client('sk-**********');

$stream = $client->completions()->createStreamed([
    'model' => 'text-davinci-003',
    'prompt' => 'PHP is ',
    'max_tokens' => 1000,
]);

return response()->stream(
    function () use ($stream) {
        foreach ($stream as $response) {
            echo $response->choices[0]->text;
            // Flush PHP's output buffer after each chunk so it reaches
            // the client immediately instead of being buffered.
            ob_flush();
            flush();
        }
    },
    200,
    [
        // Tell nginx not to buffer the response; otherwise chunks are
        // held back until the whole response is complete.
        'X-Accel-Buffering' => 'no',
    ]
);
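To answer the server-sent events question above: the example echoes raw text, which you can read straight off the response body. If you instead want the browser's EventSource API to consume it, a minimal sketch of the same loop in SSE wire format might look like this (the data: framing, text/event-stream content type, and [DONE] sentinel are SSE/client conventions I'm assuming here, not something this library requires):

return response()->stream(
    function () use ($stream) {
        foreach ($stream as $response) {
            // SSE wire format: a "data:" line followed by a blank line per message.
            echo 'data: ' . json_encode(['text' => $response->choices[0]->text]) . "\n\n";
            ob_flush();
            flush();
        }
        // Conventional sentinel so the client knows the stream is finished.
        echo "data: [DONE]\n\n";
        ob_flush();
        flush();
    },
    200,
    [
        'Content-Type' => 'text/event-stream',
        'Cache-Control' => 'no-cache',
        'X-Accel-Buffering' => 'no',
    ]
);

On the client side, new EventSource(url) with an onmessage handler would then receive each chunk as it is flushed (note EventSource only issues GET requests, so the route has to be a GET).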
Thank you so much! This is exactly what I already have in there, so I'm glad I'm on the right track.
Figured out my biggest issue was that I was running on Vapor, where the request has to complete entirely before the response is sent, which makes it impossible to stream responses (confirmed through Vapor support). So I'm currently moving everything back to Forge. 😆
Hope this helps out others who are running into the same problem!
@jhull Thanks for the update.
How can I get streaming to work in Laravel? I tried this, but maybe something went wrong.