Open pors opened 11 months ago
This approach requires adding `tiktoken` as a dependency, which I would prefer to avoid, but I might not have a choice.

It also probably doesn't work at all with structured input data (as OpenAI uses special formatting for that), so it adds a new discrepancy problem.
> This approach requires adding `tiktoken` as a dependency, which I would prefer to avoid, but I might not have a choice.
Of course, app developers can calculate it themselves like this, so perhaps just providing documentation on how to do it would be enough?
> It also probably doesn't work at all with structured input data (as OpenAI uses special formatting for that), so it adds a new discrepancy problem.
What do you mean here? The use of OpenAI functions?
I think a simple solution could be to wait until the last chunk has arrived and then calculate the usage from the prompt and response strings, as shown here: https://github.com/openai/openai-cookbook/blob/main/examples/How_to_count_tokens_with_tiktoken.ipynb

The cookbook example needs to be extended to include the prompt tokens as well, so that the full usage object can be built.