ysemenenko closed this issue 9 months ago
Unfortunately, OpenAI does not provide token usage information when streaming results. This is a limitation of the OpenAI API itself, and there is nothing I can do about it on the library side. See details in #166 and at https://community.openai.com/t/openai-api-get-usage-tokens-in-response-when-set-stream-true/. Sorry 🤷‍♂️
Thank you very much for the response.
```csharp
var message = "Count to 100, with a comma between each number and no newlines. E.g., 1, 2, 3, ...";

var chat = _api.Chat.CreateConversation();
chat.Model = Model.ChatGPTTurbo;
chat.RequestParameters.MaxTokens = 1000;
chat.RequestParameters.Temperature = 0.7;
chat.AppendUserInput(message);

await foreach (var textResult in chat.StreamResponseEnumerableFromChatbotAsync())
{
    MainThread.BeginInvokeOnMainThread(() =>
    {
        this.mainViewModel.OutputText += textResult;
    });
}

var promptTokens = chat.MostRecentApiResult?.Usage?.PromptTokens ?? 0;
var completionTokens = chat.MostRecentApiResult?.Usage?.CompletionTokens ?? 0;
```
After the stream completes, the `Usage` object is null, so it is impossible to read the token counts from it.