Closed DC-Sebastian closed 6 months ago
Bump. createStreamed() accepts
'stream_options' => ['include_usage' => true]
but the last chunk, despite having an empty choices array, does not expose a 'usage' attribute.
The api is returning the usage in the last chunk, but various "Streamed" classes in the client are not picking it up from there. I can get it to work for my purposes, but not sure when I will be able to prepare a proper PR.
Hi @svemir, could you give an example of how to get this information? Thanks
When requesting the stream from the client, also add 'stream_options' => ['include_usage' => true]
to your request. The API will return usage information in the last chunk at the end of the stream. Unfortunately, the PHP client is not set up to parse that data for chat right now. There are examples in other classes in this same repository that do streaming, but I did not have time to make all those changes and update the comments and the tests, and probably will not be able to do that any time soon.
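To illustrate what such parsing would involve, here is a minimal sketch with a hypothetical helper (not part of openai-php/client, and the payload values are illustrative): with include_usage set, each SSE chunk carries a "usage" key that is null on every chunk except the final one, whose choices array is empty.

```php
<?php

// Hypothetical helper: scan raw SSE lines from a chat completion stream
// and pull the usage object out of the final chunk.
function extractUsageFromSse(array $sseLines): ?array
{
    foreach ($sseLines as $line) {
        if (!str_starts_with($line, 'data: ') || trim($line) === 'data: [DONE]') {
            continue;
        }

        $chunk = json_decode(substr($line, 6), true);

        // Only the last chunk carries a non-null "usage" key;
        // isset() is false for the null usage on earlier chunks.
        if (isset($chunk['usage'])) {
            return $chunk['usage'];
        }
    }

    return null;
}

// Lines shaped like the API's stream (values are made up for the example).
$lines = [
    'data: {"choices":[{"delta":{"content":"Hi"}}],"usage":null}',
    'data: {"choices":[],"usage":{"prompt_tokens":9,"completion_tokens":2,"total_tokens":11}}',
    'data: [DONE]',
];

$usage = extractUsageFromSse($lines);
```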
Hi @muriloeliaseinov - I responded above. I accidentally used a test GitHub account, but it was really me!
It seems that this pull request might handle it, even though it does not seem to include all the comment and test updates I thought would be necessary: https://github.com/openai-php/client/pull/398
Cool, I'll bring @punyflash into the conversation. I discovered the project recently and am trying to use it in another project. I hope to contribute more in the future once I learn more about its architecture.
Yeah, the issue is related to my PR. I don't really have time to write tests for it; I was in a hurry to release it in my project when GPT-4o was announced, since tiktoken could no longer calculate tokens properly. Just made the PR, because why not. Works well for me:
public function completeOpenai(Thread $thread)
{
    $response = OpenAI::chat()->createStreamed([
        'model' => $thread->bot->model,
        // Ask the API to append a final chunk containing usage stats.
        'stream_options' => ['include_usage' => true],
        'messages' => $this->messages($thread)->map(fn (Message $message) => [
            'role' => $message->role,
            'content' => $message->content,
        ]),
    ]);

    foreach ($response as $chunk) {
        // The final usage chunk arrives with an empty choices array.
        if (! empty($chunk->choices)) {
            yield $chunk->choices[0]->delta?->content ?? '';
        }
    }

    // After the loop, $chunk holds the last chunk, which carries usage.
    // Note the nullsafe operator on usage as well, since it is null on
    // every chunk except the last one.
    return [$chunk?->usage?->promptTokens ?? 0, $chunk?->usage?->completionTokens ?? 0];
}
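Note that completeOpenai() is a generator, so the returned token counts are only reachable via getReturn() after the stream has been fully consumed. A self-contained illustration of that pattern (with a toy stand-in generator, since the real method needs the OpenAI client):

```php
<?php

// Toy stand-in for completeOpenai(): yields text deltas, then returns
// token counts the same way the method above does.
function streamWithUsage(): Generator
{
    foreach (['Hello', ', ', 'world'] as $delta) {
        yield $delta;
    }

    // Stands in for the [$promptTokens, $completionTokens] return above.
    return [12, 3];
}

$stream = streamWithUsage();

$text = '';
foreach ($stream as $delta) {
    $text .= $delta; // forward each delta to the client as it arrives
}

// getReturn() is only valid once the generator has finished.
[$promptTokens, $completionTokens] = $stream->getReturn();
```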
You may try it out from source by pointing Composer at the fork:
{
"repositories": [
{
"type": "vcs",
"url": "git@github.com:punyflash/openai-php-client.git"
}
],
"require": {
"openai-php/client": "dev-main as 0.9.0"
}
}
punyflash solution works for me, is there a chance that his solution will be integrated in the near future?
That's up to @nunomaduro and @gehrisandro
Implemented in https://github.com/openai-php/client/pull/398
Thx to @punyflash
Usage stats are now available when using streaming in the Chat Completions API. Set
stream_options: {"include_usage": true}
and you'll see an extra chunk at the end of the stream with usage populated: https://cookbook.openai.com/examples/how_to_stream_completions#4-how-to-get-token-usage-data-for-streamed-chat-completion-response
Maybe you can integrate this into the library so that the usage data is also available for streams.
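For reference, that extra final chunk looks roughly like this (field values illustrative; the point is that choices is empty while usage is populated):

```json
{
  "id": "chatcmpl-123",
  "object": "chat.completion.chunk",
  "created": 1715000000,
  "model": "gpt-4o",
  "choices": [],
  "usage": {
    "prompt_tokens": 9,
    "completion_tokens": 2,
    "total_tokens": 11
  }
}
```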