After investigation, this turns out to be because the TypeSpec defines a circular reference:
@doc("A single, role-attributed message within a chat completion interaction.")
model ChatMessage {
@doc("The role associated with this message payload.")
@projectedName("json", "role")
role: ChatRole;
#suppress "@azure-tools/typespec-azure-core/no-nullable" "we explicitly want a nullable string"
@doc("The text associated with this message payload.")
@projectedName("json", "content")
content: string | null;
@doc("""
The name of the author of this message. `name` is required if role is `function`, and it should be the name of the
function whose response is in the `content`. May contain a-z, A-Z, 0-9, and underscores, with a maximum length of
64 characters.
""")
@added(ServiceApiVersions.v2023_07_01_Preview)
name?: string;
@doc("The name and arguments of a function that should be called, as generated by the model.")
@added(ServiceApiVersions.v2023_07_01_Preview)
@projectedName("json", "function_call")
functionCall?: FunctionCall;
@doc("""
Additional context data associated with a chat message when requesting chat completions using compatible Azure
OpenAI chat extensions. This includes information like the intermediate data source retrievals used to service a
request.
This context information is only populated when using Azure OpenAI with chat extensions capabilities configured.
""")
@added(ServiceApiVersions.v2023_08_01_Preview)
@projectedName("csharp", "AzureExtensionsContext")
context?: AzureChatExtensionsMessageContext;
}
@added(ServiceApiVersions.v2023_08_01_Preview)
@doc("""
A representation of the additional context information available when Azure OpenAI chat extensions are involved
in the generation of a corresponding chat completions response. This context information is only populated when
using an Azure OpenAI request configured to use a matching extension.
""")
model AzureChatExtensionsMessageContext {
@doc("""
The contextual message payload associated with the Azure chat extensions used for a chat completions request.
These messages describe the data source retrievals, plugin invocations, and other intermediate steps taken in the
course of generating a chat completions response that was augmented by capabilities from Azure OpenAI chat
extensions.
""")
messages?: ChatMessage[];
}
`ChatMessage` has a property `context` of type `AzureChatExtensionsMessageContext`, which in turn has a property `messages` of type `ChatMessage[]`. A circular reference like this is not easy to detect when the cycle spans multiple layers, and the current implementation doesn't record which type is currently being deserialized.
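For context, tracking which type is currently being expanded is the standard way to detect such cycles. Below is a minimal sketch of the idea; the `ModelType` shape and the function names are hypothetical, not the emitter's actual internals, and arrays/optionals are ignored for brevity:

```typescript
// Hypothetical, simplified model shape for illustration only.
interface ModelType {
  name: string;
  properties: { name: string; type?: ModelType }[];
}

// Returns true if expanding `model` ever re-enters a type that is still
// being expanded, i.e. the model graph contains a cycle.
function hasCircularReference(
  model: ModelType,
  inProgress: Set<string> = new Set()
): boolean {
  if (inProgress.has(model.name)) {
    return true; // re-entered a type we are still expanding
  }
  inProgress.add(model.name);
  for (const prop of model.properties) {
    if (prop.type && hasCircularReference(prop.type, inProgress)) {
      return true;
    }
  }
  inProgress.delete(model.name); // backtrack so sibling branches start clean
  return false;
}
```

With the two models above, the `ChatMessage` → `context` → `messages` → `ChatMessage` path would trip the `inProgress` check on the second visit to `ChatMessage`.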
As a workaround, we hard-code a cast to `any` for `messages`/`message` to get the generated code to compile in PR https://github.com/Azure/autorest.typescript/pull/1986/files#r1299670689.
We will also need to add a manual deserialize function:
```typescript
function deserializeMessage(message: ChatMessageInRest): ChatMessage {
  return {
    role: message["role"],
    content: message["content"],
    name: message["name"],
    functionCall: !message.function_call
      ? undefined
      : {
          name: message.function_call?.["name"],
          arguments: message.function_call?.["arguments"],
        },
    context: !message.context
      ? undefined
      : {
          messages: !message.context.messages
            ? undefined
            : message.context.messages.map((m) => deserializeMessage(m)),
        },
  };
}
```
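A note on why this works: the recursion in `deserializeMessage` happens at runtime and only goes as deep as the actual payload nests, so it always terminates. The emitter's inline mapping, by contrast, recursed at code-generation time over a type expansion that never ends; giving the mapping a name is what breaks that infinite inlining.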
@qiaozha Thanks for the quick turnaround! However, just to confirm, is there a plan for a stable fix that takes circular references into account?
Yeah, I will get to this when https://github.com/Azure/autorest.typescript/pull/1962 is done, because that PR adds codegen logic for special deserialization helpers, and circular references will be handled as one of the cases.
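For illustration, one way such a deserialization-helper codegen can cope with cycles is to emit one named helper per model and guard emission with a visited set, so mutually referencing models come out as ordinary mutually recursive functions. This is only a sketch under those assumptions (simplified `Model` shape, no arrays or optionals), not necessarily the design that PR lands on:

```typescript
// Sketch only: emitting named per-model deserializers so the generator
// visits each model once, even when the model graph is cyclic.
type Model = { name: string; properties: { name: string; type?: Model }[] };

const emitted = new Set<string>();

function emitDeserializer(model: Model, out: string[]): void {
  if (emitted.has(model.name)) return; // already generated; call sites reference it by name
  emitted.add(model.name);

  const fields = model.properties
    .map((p) =>
      p.type
        ? `    ${p.name}: deserialize${p.type.name}(input["${p.name}"]),`
        : `    ${p.name}: input["${p.name}"],`
    )
    .join("\n");

  out.push(
    `function deserialize${model.name}(input: any) {\n  return {\n${fields}\n  };\n}`
  );

  // Generate deserializers for referenced models; the `emitted` guard
  // terminates the walk even when models reference each other.
  for (const p of model.properties) {
    if (p.type) emitDeserializer(p.type, out);
  }
}
```

Run on the `ChatMessage`/`AzureChatExtensionsMessageContext` pair, this emits two functions that call each other by name, much like the manual `deserializeMessage` workaround above.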
I can't reproduce this anymore; I was able to generate successfully from the commit above, and also from this playground, which contains @qiaozha's models above.
Feel free to re-open if you can get a repro.
@joheredi I think you can't repro it because it has been addressed in a hacky way temporarily, but it still needs to be addressed in a more principled way. @qiaozha Could you please share an update on progress towards fixing this?
I'm currently running into this issue, and it is blocking generation of the latest features. To reproduce, use the latest commit id from this PR (currently I'm using c8be3dc86f4dcfd3a2bd4f8a56233ff6212e679e) with version 0.19.0 of the emitter; you'll run into the same error:
Emitter "@azure-tools/typespec-ts" crashed! This is a bug.
Please file an issue at https://github.com/Azure/autorest.typescript/issues
RangeError: Maximum call stack size exceeded
at deserializeResponseValue (file:///home/codespace/workspace/sdk/openai/openai/TempTypeSpecFiles/OpenAI.Inference/node_modules/@azure-tools/typespec-ts/dist/src/modular/helpers/operationHelpers.js:519:34)
at getResponseMapping (file:///home/codespace/workspace/sdk/openai/openai/TempTypeSpecFiles/OpenAI.Inference/node_modules/@azure-tools/typespec-ts/dist/src/modular/helpers/operationHelpers.js:508:57)
at deserializeResponseValue (file:///home/codespace/workspace/sdk/openai/openai/TempTypeSpecFiles/OpenAI.Inference/node_modules/@azure-tools/typespec-ts/dist/src/modular/helpers/operationHelpers.js:537:48)
at getResponseMapping (file:///home/codespace/workspace/sdk/openai/openai/TempTypeSpecFiles/OpenAI.Inference/node_modules/@azure-tools/typespec-ts/dist/src/modular/helpers/operationHelpers.js:508:57)
at getResponseMapping (file:///home/codespace/workspace/sdk/openai/openai/TempTypeSpecFiles/OpenAI.Inference/node_modules/@azure-tools/typespec-ts/dist/src/modular/helpers/operationHelpers.js:492:179)
Pending a new release of the emitter.
Repro steps:
1. In `sdk/openai/openai/tsp-location.json`, set the commit to `0d3d201639b9e7f3f448bbd08b1fe2fd461ecf49`.
2. Run `rushx generate`.
You can find the TypeSpec in https://github.com/Azure/azure-rest-api-specs/pull/25271.
This is high priority to unblock the OpenAI release scheduled later this month.