Closed ReSpawN closed 1 week ago
Description

\OpenAI\Responses\Chat\CreateResponseUsage does not expose the Prompt Caching fields returned by the API, which gives the impression that ChatGPT isn't 'activating' Prompt Caching (e.g. due to an unsupported model), even though the raw response contains them:

{"prompt_tokens":1669,"completion_tokens":526,"total_tokens":2195,"prompt_tokens_details":{"cached_tokens":1152},"completion_tokens_details":{"reasoning_tokens":0,"accepted_prediction_tokens":0,"rejected_prediction_tokens":0}}

Steps To Reproduce

Simply send a request to chat/completions via chat()->create([]).

OpenAI PHP Client Version

v0.10.2

PHP Version

8.3.10

Notes

No response
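A minimal sketch of a workaround until the typed usage object exposes the field: read cached_tokens straight out of the raw usage payload (the openai-php response objects can be converted to an array, e.g. via toArray()). The cachedTokens() helper below is hypothetical, and the payload is the one from the report above:

```php
<?php

// Hypothetical helper: pull cached_tokens from the raw usage array,
// since CreateResponseUsage in v0.10.2 does not expose prompt_tokens_details.
function cachedTokens(array $usage): int
{
    return $usage['prompt_tokens_details']['cached_tokens'] ?? 0;
}

// Raw usage payload exactly as returned by chat/completions (from this report):
$usage = json_decode(
    '{"prompt_tokens":1669,"completion_tokens":526,"total_tokens":2195,'
    . '"prompt_tokens_details":{"cached_tokens":1152},'
    . '"completion_tokens_details":{"reasoning_tokens":0,'
    . '"accepted_prediction_tokens":0,"rejected_prediction_tokens":0}}',
    true
);

echo cachedTokens($usage), PHP_EOL; // 1152
```

With the real client this would look something like `cachedTokens($response->toArray()['usage'] ?? [])`, falling back to 0 when the model or endpoint returns no caching details.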
Implemented in https://github.com/openai-php/client/pull/494