golbin opened this issue 2 weeks ago
The OpenAI usage data includes the following information, but currently only the Anthropic services store cached token details:
"usage": { "prompt_tokens": 2006, "completion_tokens": 300, "total_tokens": 2306, "prompt_tokens_details": { "cached_tokens": 1920 }, "completion_tokens_details": { "reasoning_tokens": 0 } }
I noticed that this is already being handled as shown below. It might be better to move this into the OpenAI service:
https://github.com/pipecat-ai/pipecat/blob/ca15d9738349fa99b1baf4e3a0a61dfeada01424/src/pipecat/services/openai_realtime_beta/events.py#L367
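For illustration, the cached token details could be modeled in the OpenAI service along these lines. This is a minimal sketch based on the usage payload above; the class and field names here are hypothetical and not Pipecat's actual definitions:

```python
# Hypothetical sketch: Pydantic models mirroring the OpenAI usage payload shown
# above, so cached/reasoning token details can be stored alongside the totals.
# Names are illustrative, not the actual Pipecat or OpenAI SDK classes.
from typing import Optional

from pydantic import BaseModel


class PromptTokensDetails(BaseModel):
    cached_tokens: int = 0


class CompletionTokensDetails(BaseModel):
    reasoning_tokens: int = 0


class ChatUsage(BaseModel):
    prompt_tokens: int
    completion_tokens: int
    total_tokens: int
    prompt_tokens_details: Optional[PromptTokensDetails] = None
    completion_tokens_details: Optional[CompletionTokensDetails] = None


# Parse the usage block from the example response above.
usage = ChatUsage.model_validate(
    {
        "prompt_tokens": 2006,
        "completion_tokens": 300,
        "total_tokens": 2306,
        "prompt_tokens_details": {"cached_tokens": 1920},
        "completion_tokens_details": {"reasoning_tokens": 0},
    }
)
print(usage.prompt_tokens_details.cached_tokens)  # 1920
```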
Thanks @golbin! Would you be up to making a PR with the changes you envision?