Open jschuur opened 1 year ago
Streamed responses don't include usage info in the response. We'd have to calculate it via [tiktoken](https://github.com/dqbd/tiktoken), for example, but that would be a big dependency for the project.
Alternatively, count chunks manually for a rough estimate (more info).
More lightweight JS solutions:
https://www.npmjs.com/package/gpt-tokens
https://www.npmjs.com/package/gpt-tokenizer
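If a real tokenizer dependency is too heavy, a rough dependency-free estimate is possible. This is a minimal sketch assuming the common "~4 characters per token for English text" heuristic; the function name is hypothetical, and the result is only a ballpark figure, not a substitute for gpt-tokenizer or tiktoken:

```javascript
// Crude token estimate for accumulated streamed text, assuming the
// ~4 chars/token heuristic for English. Accuracy varies by language
// and content; use a real tokenizer when billing-grade counts matter.
function estimateTokens(text) {
  return Math.ceil(text.length / 4);
}

// Accumulate streamed deltas, then estimate once at the end:
let streamedText = "";
for (const delta of ["Hel", "lo, ", "world!"]) {
  streamedText += delta;
}
console.log(estimateTokens(streamedText)); // 13 chars → 4
```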
Usage stats for streaming are now supported (at least for chat completions):
https://cookbook.openai.com/examples/how_to_stream_completions#4-how-to-get-token-usage-data-for-streamed-chat-completion-response
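Per that cookbook page, requesting the stream with `stream_options: { include_usage: true }` makes the API send one extra final chunk whose `choices` array is empty and whose `usage` field is populated. A minimal sketch of consuming such a stream (the chunks here are mock objects shaped like the API's stream events, and `collectStream` is a hypothetical helper, not part of any SDK):

```javascript
// Sketch: accumulate text deltas and pick up the usage object from the
// final chunk of a stream created with stream_options: { include_usage: true }.
function collectStream(chunks) {
  let text = "";
  let usage = null;
  for (const chunk of chunks) {
    const delta = chunk.choices?.[0]?.delta?.content;
    if (delta) text += delta;
    if (chunk.usage) usage = chunk.usage; // only present on the last chunk
  }
  return { text, usage };
}

// Mock chunks shaped like streamed chat completion events:
const chunks = [
  { choices: [{ delta: { content: "Hel" } }], usage: null },
  { choices: [{ delta: { content: "lo" } }], usage: null },
  { choices: [], usage: { prompt_tokens: 5, completion_tokens: 2, total_tokens: 7 } },
];
const { text, usage } = collectStream(chunks);
console.log(text, usage.total_tokens); // Hello 7
```

With this option there's no need for tiktoken or manual chunk counting when the usage data is the only thing required.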