openai-php / client

⚡️ OpenAI PHP is a supercharged community-maintained PHP API client that allows you to interact with the OpenAI API.

Get token count at end of stream #186

Closed Jafo232 closed 1 year ago

Jafo232 commented 1 year ago

Consider:

    $stream = $this->client->chat()->createStreamed([
        'model' => config('services.openai.model'),
        'messages' => $prompts,
    ]);

    foreach ($stream as $response) {
        $this->response .= $response->choices[0]->delta->content;
    }

How do I get access to the tokens used after all of this is done? The response property is private.

Jafo232 commented 1 year ago

I submitted a PR that would make this work:

https://github.com/openai-php/client/pull/187

gehrisandro commented 1 year ago

You can now use the latest release which gives you access to the header / meta information on every response: https://github.com/openai-php/client#meta-information
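As a rough sketch of what that looks like on a streamed response (property names are taken from the README's meta-information section and should be verified against your installed release):

    $stream = $client->chat()->createStreamed([
        'model' => 'gpt-3.5-turbo',
        'messages' => $prompts,
    ]);

    // Meta information is built from the HTTP response headers.
    $meta = $stream->meta();

    $meta->requestId;             // request id header
    $meta->tokenLimit->remaining; // remaining token rate limit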

georgesma commented 9 months ago

> You can now use the latest release which gives you access to the header / meta information on every response: https://github.com/openai-php/client#meta-information

I have the same problem. I've tried using the meta information, but it doesn't include usage details (e.g. promptTokens, completionTokens, totalTokens).

Jafo232 commented 5 months ago

But this doesn't tell you how many input/output tokens were used, unless I am missing something.

georgesma commented 4 months ago

> But this doesn't tell you how many input/output tokens were used, unless I am missing something.

I just found out that this is due to an OpenAI API limitation, not the PHP implementation. Source: https://cookbook.openai.com/examples/how_to_stream_completions#downsides

_Another small drawback of streaming responses is that the response no longer includes the usage field to tell you how many tokens were consumed. After receiving and combining all of the responses, you can calculate this yourself using tiktoken._
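A rough sketch of that do-it-yourself counting in PHP, reusing `$prompts` and `$this->response` from the snippet above. This assumes the third-party yethee/tiktoken package installed via Composer, which is not part of openai-php/client; its class and method names should be checked against its own documentation:

    use Yethee\Tiktoken\EncoderProvider;

    $provider = new EncoderProvider();
    $encoder  = $provider->getForModel('gpt-3.5-turbo');

    // Count the tokens sent in the prompt messages...
    $promptTokens = 0;
    foreach ($prompts as $message) {
        $promptTokens += count($encoder->encode($message['content']));
    }

    // ...and the tokens received by encoding the concatenated streamed deltas.
    $completionTokens = count($encoder->encode($this->response));

    // Note: the chat format adds a few overhead tokens per message,
    // so this is an approximation, not an exact match of the API's usage field.
    $totalTokens = $promptTokens + $completionTokens;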