Closed: hnvmeta closed this issue 9 months ago
The problem is that the chat response chunk object doesn't support carrying the usage data, unlike the asynchronous chat response object.
Hi @anasfik, I am looking for this feature too. It seems that the chat response chunk object does include usage according to the page you linked (maybe it didn't before?); you just need to pass stream_options: {"include_usage": true}. The docs say usage will be null until the final chunk.
I will try to find where to put that in the library, but any help would be appreciated.
Update: I made a PR for this: https://github.com/anasfik/openai/pull/180
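For anyone trying this out, here is a minimal sketch (in Python, since the chunk JSON is language-agnostic) of how the usage data arrives only on the final streamed chunk when include_usage is enabled. The sample payload below is illustrative, not captured from the API:

```python
import json

# Illustrative final-chunk payload when stream_options: {"include_usage": true}
# is set. Earlier chunks carry "usage": null; the final chunk has an empty
# "choices" list and the aggregate token counts.
final_chunk = json.loads("""
{
  "id": "chatcmpl-example",
  "object": "chat.completion.chunk",
  "choices": [],
  "usage": {"prompt_tokens": 12, "completion_tokens": 34, "total_tokens": 46}
}
""")

def usage_total_tokens(chunk: dict):
    """Return total_tokens from a chunk, or None while usage is still null."""
    usage = chunk.get("usage")
    return usage.get("total_tokens") if usage else None

print(usage_total_tokens(final_chunk))      # 46
print(usage_total_tokens({"usage": None}))  # None
```

So a consumer can ignore chunks whose usage is null and read the totals from the last one.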
I need to read the total_tokens field in the JSON returned from OpenAI. Can you add this field to the OpenAIStreamChatCompletionModel class? Because this class is final, I can't extend it.