Open sallahbaksh opened 1 month ago
Thanks for the feedback! We are routing this to the appropriate team for follow-up. cc @glharper.
Hi @sallahbaksh, thanks for opening this report! In order to help diagnose the issue, could you please enable logging and share the logs? You can do so by setting the environment variable DEBUG to true.
If the logs contain any sensitive information, please feel free to share them through a support ticket. To open a support ticket, please refer to the guide here. In the meantime, please feel free to reach out if you have any other questions!
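A minimal sketch of the suggested logging setup, assuming the Node app is launched from the same shell (the app filename and log path are placeholders, not from this thread):

```shell
# DEBUG=true is the variable the maintainer asked to set above.
export DEBUG=true
echo "DEBUG=$DEBUG"
# Then start the app as usual and capture output for the ticket, e.g.:
#   node app.js > openai-debug.log 2>&1
```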
Hi @sallahbaksh. Thank you for opening this issue and giving us the opportunity to assist. To help our team better understand your issue and the details of your scenario, please respond to the questions and requests for information above. This will help us more accurately address your issue.
Hi @sallahbaksh, we're sending this friendly reminder because we haven't heard back from you in 7 days. We need more information about this issue to help address it. Please be sure to give us your input. If we don't hear back from you within 14 days of this comment the issue will be automatically closed. Thank you!
I've created a support ticket through Azure
@sallahbaksh did you get the support needed? I didn't hear from the support team about your ticket.
**Describe the bug**
I am attempting to stream the GPT response within a PTU deployment environment, but no response is being returned by the model. The content sent to GPT includes 35 messages, 15 of which are PNG images that have been converted into base64 strings; the rest are regular text. It works correctly without any errors when I set stream to false.
**To Reproduce**
Steps to reproduce the behavior:

```js
const client = new AzureOpenAI({
  apiKey: apiKey,
  apiVersion: apiVersion,
  endpoint: endpoint,
  deployment: deploymentName,
});

const results = await client.chat.completions.create({
  stream: true,
  messages: messageText,
  model: modelName,
  max_tokens: maxTokens,
  seed: seed,
});

let streamedMessage = "";
for await (const chunk of results) {
  for (const choice of chunk.choices) {
    console.log(`Chunk: ${choice.delta?.content}`);
    if (choice.delta?.content != null) {
      streamedMessage += choice.delta.content;
    }
  }
}
```

**Expected behavior**
I expect a response to be returned by the model when I set stream to true.
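For context, the payload mixes text and base64-encoded PNG images. A hedged sketch of how one such image message might be built (the helper name and the `Buffer` input are assumptions for illustration, not code from this report):

```javascript
// Hypothetical helper: wrap a PNG buffer as a chat message carrying a
// base64 data URL, the shape described in the bug report above.
function buildImageMessage(pngBuffer) {
  const base64 = pngBuffer.toString("base64");
  return {
    role: "user",
    content: [
      {
        type: "image_url",
        image_url: { url: `data:image/png;base64,${base64}` },
      },
    ],
  };
}
```

With 15 such messages plus 20 text messages, the prompt can be large, which may matter for streaming behavior under a PTU deployment.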