openai / openai-dotnet

The official .NET library for the OpenAI API
https://www.nuget.org/packages/OpenAI
MIT License

missing_required_parameter response_format.json_schema using example code in Unity (C# 9.0) #205

Closed · Swah closed this 2 months ago

Swah commented 3 months ago

Confirm this is not an issue with the OpenAI .NET Library

Confirm this is not an issue with the underlying OpenAI API

Confirm this is not an issue with Azure OpenAI

Describe the bug

I always get the same error message as soon as I try to use any variation of ChatResponseFormat.CreateJsonSchemaFormat:

Couldn't get response from ChatGPT: System.ClientModel.ClientResultException: HTTP 400 (invalid_request_error: missing_required_parameter)
Parameter: response_format.json_schema
Missing required parameter: 'response_format.json_schema'.
  at OpenAI.ClientPipelineExtensions.ProcessMessageAsync (System.ClientModel.Primitives.ClientPipeline pipeline, System.ClientModel.Primitives.PipelineMessage message, System.ClientModel.Primitives.RequestOptions options) [0x0015e] in <92cde0f04c3a467caa9bf3e15fe4f0c8>:0 
  at System.Threading.Tasks.ValueTask`1[TResult].get_Result () [0x0001b] in <27124aa0e30a41659b903b822b959bc7>:0 
  at OpenAI.Chat.ChatClient.CompleteChatAsync (System.ClientModel.BinaryContent content, System.ClientModel.Primitives.RequestOptions options) [0x000ad] in <92cde0f04c3a467caa9bf3e15fe4f0c8>:0 
  at OpenAI.Chat.ChatClient.CompleteChatAsync (System.Collections.Generic.IEnumerable`1[T] messages, OpenAI.Chat.ChatCompletionOptions options, System.Threading.CancellationToken cancellationToken) [0x00152] in <92cde0f04c3a467caa9bf3e15fe4f0c8>:0 

In the code below, if I use chatOptions.ResponseFormat = ChatResponseFormat.JsonObject instead, I don't get an error, but then of course I can't specify the schema (a sketch of that fallback follows).
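
For comparison, this is roughly what that working fallback looks like. Note that JsonObject only forces syntactically valid JSON and cannot carry a schema, and the API also expects the prompt itself to mention JSON when this mode is used:

var fallbackOptions = new ChatCompletionOptions();
// JSON mode: the model must return valid JSON, but no schema can be attached.
fallbackOptions.ResponseFormat = ChatResponseFormat.JsonObject;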

This user was also having issues in Unity, but their error message was unclear, so it's hard to tell whether it's the same issue. In any case, whatever fixed it should be documented.

To Reproduce

Run the code below in Unity 2022.3.22f1. The library was installed using NuGet for Unity.

Code snippets

var client = new OpenAI.Chat.ChatClient(model: model.Or(DEFAULT_CHAT_GPT_MODEL), apiKey);
var chatOptions = new ChatCompletionOptions();
chatOptions.ResponseFormat = ChatResponseFormat.CreateJsonSchemaFormat(
    name: "math_reasoning",
    jsonSchema: BinaryData.FromString(
        "{\n" +
        "    \"type\": \"object\",\n" +
        "    \"properties\": {\n" +
        "        \"steps\": {\n" +
        "            \"type\": \"array\",\n" +
        "            \"items\": {\n" +
        "                \"type\": \"object\",\n" +
        "                \"properties\": {\n" +
        "                    \"explanation\": { \"type\": \"string\" },\n" +
        "                    \"output\": { \"type\": \"string\" }\n" +
        "                },\n" +
        "                \"required\": [\"explanation\", \"output\"],\n" +
        "                \"additionalProperties\": false\n" +
        "            }\n" +
        "        },\n" +
        "        \"final_answer\": { \"type\": \"string\" }\n" +
        "    },\n" +
        "    \"required\": [\"steps\", \"final_answer\"],\n" +
        "    \"additionalProperties\": false\n" +
        "}"),
    strictSchemaEnabled: true);
var chatMessage = new UserChatMessage(query);
return client.CompleteChatAsync(new List<UserChatMessage> { chatMessage }, chatOptions);
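
For context, the calling side would look roughly like this once the json_schema format is accepted. This is only a sketch against the 2.0.0-beta surface; the final_answer property comes from the schema above, and error handling is omitted:

// Sketch: await the task returned above and pull a field out of the schema-constrained JSON.
ChatCompletion completion = await client.CompleteChatAsync(
    new List<UserChatMessage> { chatMessage }, chatOptions);
using var parsed = System.Text.Json.JsonDocument.Parse(completion.Content[0].Text);
string finalAnswer = parsed.RootElement.GetProperty("final_answer").GetString();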

OS

Windows

.NET version

.NET Standard 2.1

Library version

2.0.0-beta.10

Swah commented 3 months ago

Just adding that using function calls seems to work (see https://github.com/openai/openai-dotnet/issues/193)

private ChatTool GetSchema<T>() => ChatTool.CreateFunctionTool(
    functionName: typeof(T).Name, // nameof(T) would literally produce "T", not the type's name
    functionDescription: "This is a format to create the novel character the user wants.",
    functionParameters: BinaryData.FromString(CreateSchema<T>()));

ChatCompletionOptions options = new()
{
    Tools = { GetSchema<T>() },
    ToolChoice = ChatToolChoice.Required,
};

client.CompleteChatAsync(new List<UserChatMessage> { chatMessage }, options);

So the issue could potentially be with ChatResponseFormat.CreateJsonSchemaFormat, or with how chat responses are handled internally compared to function calls. It seems like using Unity / C# 9.0 could be involved as well.
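
One way to narrow that down is to serialize the options from the first snippet locally and check whether the nested json_schema object even makes it into the JSON. This is only a diagnostic sketch, not an official debugging hook of the library, and it assumes ChatCompletionOptions implements IJsonModel, as the generated models in the 2.x betas do:

// Diagnostic sketch: render chatOptions (from the first snippet) to JSON in-process.
// If "json_schema" is missing here, the request body is being built without it,
// which would point at serialization rather than the service.
BinaryData serializedOptions = System.ClientModel.Primitives.ModelReaderWriter.Write(chatOptions);
UnityEngine.Debug.Log(serializedOptions.ToString()); // expect a nested "json_schema": { ... } object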

joseharriaga commented 2 months ago

Thank you for reaching out, @Swah! Based on the information that you and @helloSalmon provided here and in the other thread (https://github.com/openai/openai-dotnet/issues/193), we were able to identify the issue and track it back to this issue in the .NET runtime: 🔗 https://github.com/dotnet/runtime/issues/103365.

I implemented a mitigation as part of this PR: 🔗 https://github.com/openai/openai-dotnet/pull/206. I confirmed that this fixes the problem in Unity. ChatResponseFormat works as expected now.

We also just pushed a release today, so you can go grab this fix starting with version 2.0.0-beta.11: 🔗 https://www.nuget.org/packages/OpenAI/2.0.0-beta.11

Swah commented 2 months ago

@joseharriaga that was one of the quickest ticket resolutions I've experienced, thanks :)

I can confirm beta.11 seems to fix the issue for me, using chat completion instead of function calls. Thanks again!