Closed — AustinZzx closed this issue 9 months ago
Hi @AustinZzx, thank you for raising the issue. To better assist you, could you share the model you are using and the logging information from the SDK? To enable logging, please refer to the Troubleshooting section in the README here.
I was using gpt-3.5-turbo-1106.
Logging information:
```
azure:openai:info Response status code: 200
azure:openai:info Headers: {
  "cache-control": "no-cache, must-revalidate",
  "transfer-encoding": "chunked",
  "content-type": "text/event-stream",
  "access-control-allow-origin": "*",
  "apim-request-id": "2af11589-4012-4076-9948-258d04697417",
  "strict-transport-security": "max-age=31536000; includeSubDomains; preload",
  "x-content-type-options": "nosniff",
  "x-ms-region": "West US",
  "x-ratelimit-remaining-requests": "999",
  "x-ratelimit-remaining-tokens": "999939",
  "x-accel-buffering": "no",
  "x-request-id": "99666d90-8d45-4395-bb81-7b82b066ff3c",
  "x-ms-client-request-id": "1d15e516-7c80-46d8-bbc1-6145077649d9",
  "azureml-model-session": "d012-20240105195133",
  "date": "Mon, 22 Jan 2024 22:57:10 GMT"
}
azure:core-rest-pipeline retryPolicy:info Retry 0: Received a response from request 1d15e516-7c80-46d8-bbc1-6145077649d9
azure:core-rest-pipeline retryPolicy:info Retry 0: Processing 2 retry strategies.
azure:core-rest-pipeline retryPolicy:info Retry 0: Processing retry strategy throttlingRetryStrategy.
azure:core-rest-pipeline retryPolicy:info Retry 0: Skipped.
azure:core-rest-pipeline retryPolicy:info Retry 0: Processing retry strategy exponentialRetryStrategy.
azure:core-rest-pipeline retryPolicy:info Retry 0: Skipped.
azure:core-rest-pipeline retryPolicy:info None of the retry strategies could work with the received response. Returning it.
```
```
Error in gpt stream: TypeError: Cannot read properties of undefined (reading 'map')
    at getChatCompletionsResult (/Users/zexiazhang/Desktop/retell/conversational-ai/node_modules/@azure/openai/src/api/client/openAIClient/deserializers.ts:82:22)
    at Object.transform (/Users/zexiazhang/Desktop/retell/conversational-ai/node_modules/@azure/openai/src/api/oaiSse.ts:32:9)
    at ensureIsPromise (node:internal/webstreams/util:185:19)
    at transformStreamDefaultControllerPerformTransform (node:internal/webstreams/transformstream:505:18)
    at transformStreamDefaultSinkWriteAlgorithm (node:internal/webstreams/transformstream:555:10)
    at Object.write (node:internal/webstreams/transformstream:360:14)
    at ensureIsPromise (node:internal/webstreams/util:185:19)
    at writableStreamDefaultControllerProcessWrite (node:internal/webstreams/writablestream:1109:5)
    at writableStreamDefaultControllerAdvanceQueueIfNeeded (node:internal/webstreams/writablestream:1224:5)
    at writableStreamDefaultControllerWrite (node:internal/webstreams/writablestream:1098:3)
```
Hi @AustinZzx, thank you for providing the logging! The deserializer method seems to access an undefined property. I opened a PR with the fix in https://github.com/Azure/azure-sdk-for-js/pull/28352. I'll keep you updated when there is a new release. In the meantime, please let us know if you have any other questions!
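For illustration only, here is a minimal sketch of the failure mode and the kind of defensive guard such a fix typically adds. The type names and the `deserializeToolCalls` helper are hypothetical; the real code lives in the SDK's `deserializers.ts` and differs in detail.

```typescript
// Hypothetical simplified shapes for a streamed chat-completions delta.
interface ToolCallDelta {
  id?: string;
  function?: { name?: string; arguments?: string };
}
interface ChoiceDelta {
  content?: string;
  toolCalls?: ToolCallDelta[];
}

// Before the fix (conceptually): calling `delta.toolCalls.map(...)` throws
// "Cannot read properties of undefined (reading 'map')" whenever a streamed
// chunk carries no tool calls, e.g. a plain content chunk.
function deserializeToolCalls(delta: ChoiceDelta): ToolCallDelta[] {
  // Guard: fall back to an empty array when the field is absent in this chunk.
  return (delta.toolCalls ?? []).map((t) => ({ ...t }));
}
```

The general pattern is to treat every per-chunk field in a streaming delta as optional, since each SSE event only contains the fields that changed.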
Hi @AustinZzx, @azure/openai@1.0.0-beta.11 has been released with the fix. Please give it a try and let us know if you have any other questions. Thank you for your patience!
Hi @AustinZzx. Thank you for opening this issue and giving us the opportunity to assist. We believe that this has been addressed. If you feel that further discussion is needed, please add a comment with the text "/unresolve" to remove the "issue-addressed" label and continue the conversation.
Hi @AustinZzx, since you haven’t asked that we /unresolve the issue, we’ll close this out. If you believe further discussion is needed, please add a comment with the text "/unresolve" to reopen the issue.
Describe the bug
When GPT returns two function calls in a single streaming response, the SDK throws this TypeError.
To Reproduce
Steps to reproduce the behavior:
```typescript
let result = await client.streamChatCompletions(
  Gpt35TurboDeploymentId,
  [
    {
      role: "user",
      content: "What's the weather in Chicago and Boston?",
    },
  ],
  {
    temperature: 0,
    maxTokens: 50,
    tools: [
      {
        type: "function",
        function: {
          name: "get_weather",
          description: "get weather information",
          parameters: {
            type: "object",
            properties: {
              location: {
                type: "string",
                description: "location in format of city, state, nation.",
              },
            },
          },
        },
      },
    ],
  },
);

try {
  for await (const event of result) {
    if (event.choices.length >= 1) {
      let delta = event.choices[0].delta;
      if (!delta) continue;
      if (delta.toolCalls.length >= 1) {
        console.log(delta.toolCalls);
      }
    }
  }
} catch (err) {
  console.error("Error in gpt stream: ", err);
}
```
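As a side note for anyone consuming parallel tool calls from a stream: each streamed fragment typically carries an index identifying which tool call it belongs to, and the arguments string arrives in pieces that must be concatenated. The sketch below is an assumption-laden illustration of that reassembly, not the SDK's API; the `ToolCallChunk` shape and `accumulateToolCalls` helper are hypothetical.

```typescript
// Hypothetical shape of one streamed tool-call fragment. The `index` field
// distinguishes the two parallel calls (Chicago vs. Boston) in this issue.
interface ToolCallChunk {
  index: number;
  id?: string;
  function?: { name?: string; arguments?: string };
}

interface AssembledCall {
  id?: string;
  name?: string;
  args: string; // JSON arguments, concatenated across chunks
}

// Reassemble per-chunk fragments into complete tool calls, keyed by index.
function accumulateToolCalls(events: ToolCallChunk[][]): Map<number, AssembledCall> {
  const calls = new Map<number, AssembledCall>();
  for (const chunkList of events) {
    for (const c of chunkList) {
      const entry = calls.get(c.index) ?? { args: "" };
      if (c.id) entry.id = c.id; // id/name arrive once, on the first fragment
      if (c.function?.name) entry.name = c.function.name;
      if (c.function?.arguments) entry.args += c.function.arguments; // append pieces
      calls.set(c.index, entry);
    }
  }
  return calls;
}
```

Note also the optional chaining: checking `delta.toolCalls?.length` rather than `delta.toolCalls.length` avoids the same class of undefined-property error in user code when a chunk contains only content.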