What problem is this fixing?
Closing the gap between what the OpenAI API already exposes and what this library supports.
Describe your change
I wanted to be able to get the logprobs in the response from OpenAI. It looks like it's a non-beta feature exposed on the chat completion object:
https://platform.openai.com/docs/api-reference/chat/object
I've attempted to capture it as specified, and verified that top_logprobs is returned whether or not we limit how many candidates come back, as covered by the test.
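For reference, this is roughly the shape being captured, per the linked chat completion object docs: the logprobs block sits on each choice, with per-token entries and an optional top_logprobs list (values below are illustrative, not real output):

```json
{
  "choices": [
    {
      "index": 0,
      "message": { "role": "assistant", "content": "Hello" },
      "logprobs": {
        "content": [
          {
            "token": "Hello",
            "logprob": -0.31,
            "bytes": [72, 101, 108, 108, 111],
            "top_logprobs": [
              { "token": "Hello", "logprob": -0.31 },
              { "token": "Hi", "logprob": -1.42 }
            ]
          }
        ]
      },
      "finish_reason": "stop"
    }
  ]
}
```

The length of each top_logprobs array is controlled by the top_logprobs request parameter; when it is omitted, the field behavior is what the test exercises.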
This should resolve: https://github.com/aallam/openai-kotlin/issues/326