The "chatCompletionStreamOptions" definition has been added to the schema for returning token usage in streaming responses, but it is never referenced anywhere in the YAML. Compared to the OpenAI specs, the "stream_options" parameter is missing:
https://platform.openai.com/docs/api-reference/chat
Expected behavior
The "stream_options" parameter should be exposed in the request body as an optional parameter.
Actual behavior
"stream_options" is not available, and I had to consult the OpenAI specs to understand how to use the "include_usage" parameter.
Reproduction Steps
Create a request with Postman to the ChatCompletion endpoint.
Pass "stream": true.
There is no way to tell from the spec which parameter name to pass for the streaming options.
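For reference, this is the request body I expected to be able to send based on the OpenAI docs. The sketch below just builds the JSON payload; the endpoint URL and deployment are placeholders, and whether the Azure 2024-08-01-preview service actually accepts "stream_options" is exactly what this issue is about.

```python
import json

# Hypothetical chat completions request body following the OpenAI spec.
# "stream_options" is the field that is missing from the Azure YAML.
payload = {
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": True,
    # Per the OpenAI reference, this makes the final streamed chunk
    # include a "usage" object with the token counts.
    "stream_options": {"include_usage": True},
}

print(json.dumps(payload, indent=2))
```

This payload would be POSTed to the deployment's chat/completions URL (e.g. via Postman) with the usual api-key header; without "stream_options" in the spec there is no documented way to request usage in the stream.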
API Spec link
https://github.com/Azure/azure-rest-api-specs/blob/main/specification/cognitiveservices/data-plane/AzureOpenAI/inference/preview/2024-08-01-preview/inference.yaml
API Spec version
2024-08-01-preview
Environment
No response