microsoft / semantic-kernel

Integrate cutting-edge LLM technology quickly and easily into your apps
https://aka.ms/semantic-kernel
MIT License

.Net: Bug: "Missing required parameter: 'tools[0].function'." #6825

Open nor0x opened 2 weeks ago

nor0x commented 2 weeks ago

Repro is here: BlazorApp1.zip

Describe the bug
I'm using Semantic Kernel with some KernelFunctions in a Blazor WebAssembly app. The kernel is built as follows, with the plugin registered:

var kernelBuilder = Kernel.CreateBuilder();
kernelBuilder.AddOpenAIChatCompletion(_textModel, _apiKey, httpClient: new HttpClient());
kernelBuilder.Plugins.AddFromType<MyPlugin>();
var kernel = kernelBuilder.Build();

The plugin in the repro is very basic and simply logs messages to the console:


[KernelFunction, Description("Log message to console")]
public void Log(string message)
{
    Console.WriteLine(message);
}
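
For context, the method above has to live inside a plugin class for `AddFromType<MyPlugin>()` to pick it up. A minimal sketch of such a class follows; the class name and using directives are assumptions here, not copied from the repro:

```csharp
using System;
using System.ComponentModel;
using Microsoft.SemanticKernel;

// Minimal sketch of the plugin class registered via kernelBuilder.Plugins.AddFromType<MyPlugin>().
// Semantic Kernel discovers public methods annotated with [KernelFunction] and exposes
// them to the model as tools, using the Description attribute as the tool description.
public class MyPlugin
{
    [KernelFunction, Description("Log message to console")]
    public void Log(string message)
    {
        Console.WriteLine(message);
    }
}
```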

And this code is used to send the prompt to the LLM:

var completionService = _kernel.GetRequiredService<IChatCompletionService>();

var result = await completionService.GetChatMessageContentAsync(
    $"Log '{Guid.NewGuid().ToString("N")}' to console!",
    new OpenAIPromptExecutionSettings()
    {
        ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions,
        ChatSystemPrompt = _onlyFunctionPrompt,
        Temperature = 0.4,
    },
    _kernel);

The call to GetChatMessageContentAsync produces the following error message:

crit: Microsoft.AspNetCore.Components.WebAssembly.Rendering.WebAssemblyRenderer[100]
      Unhandled exception rendering component: Missing required parameter: 'tools[0].function'.
      Status: 400 (Bad Request)
      ErrorCode: missing_required_parameter

      Content:
      {
        "error": {
          "message": "Missing required parameter: 'tools[0].function'.",
          "type": "invalid_request_error",
          "param": "tools[0].function",
          "code": "missing_required_parameter"
        }
      }

      Headers:
      Content-Length: 199
      Content-Type: application/json

Microsoft.SemanticKernel.HttpOperationException: Missing required parameter: 'tools[0].function'.
Status: 400 (Bad Request)
ErrorCode: missing_required_parameter

Content:
{
  "error": {
    "message": "Missing required parameter: 'tools[0].function'.",
    "type": "invalid_request_error",
    "param": "tools[0].function",
    "code": "missing_required_parameter"
  }
}

Headers:
Content-Length: 199
Content-Type: application/json

 ---> Azure.RequestFailedException: Missing required parameter: 'tools[0].function'.
Status: 400 (Bad Request)
ErrorCode: missing_required_parameter

Content:
{
  "error": {
    "message": "Missing required parameter: 'tools[0].function'.",
    "type": "invalid_request_error",
    "param": "tools[0].function",
    "code": "missing_required_parameter"
  }
}

Headers:
Content-Length: 199
Content-Type: application/json

   at Azure.Core.HttpPipelineExtensions.ProcessMessageAsync(HttpPipeline pipeline, HttpMessage message, RequestContext requestContext, CancellationToken cancellationToken)
   at Azure.AI.OpenAI.OpenAIClient.GetChatCompletionsAsync(ChatCompletionsOptions chatCompletionsOptions, CancellationToken cancellationToken)
   at Microsoft.SemanticKernel.Connectors.OpenAI.ClientCore.<RunRequestAsync>d__53`1[[Azure.Response`1[[Azure.AI.OpenAI.ChatCompletions, Azure.AI.OpenAI, Version=1.0.0.0, Culture=neutral, PublicKeyToken=92742159e12e44c8]], Azure.Core, Version=1.39.0.0, Culture=neutral, PublicKeyToken=92742159e12e44c8]].MoveNext()
   --- End of inner exception stack trace ---
   at Microsoft.SemanticKernel.Connectors.OpenAI.ClientCore.<RunRequestAsync>d__53`1[[Azure.Response`1[[Azure.AI.OpenAI.ChatCompletions, Azure.AI.OpenAI, Version=1.0.0.0, Culture=neutral, PublicKeyToken=92742159e12e44c8]], Azure.Core, Version=1.39.0.0, Culture=neutral, PublicKeyToken=92742159e12e44c8]].MoveNext()
   at Microsoft.SemanticKernel.Connectors.OpenAI.ClientCore.GetChatMessageContentsAsync(ChatHistory chat, PromptExecutionSettings executionSettings, Kernel kernel, CancellationToken cancellationToken)
   at Microsoft.SemanticKernel.ChatCompletion.ChatCompletionServiceExtensions.GetChatMessageContentAsync(IChatCompletionService chatCompletionService, String prompt, PromptExecutionSettings executionSettings, Kernel kernel, CancellationToken cancellationToken)

Additional context
Please note that a new HttpClient is passed because the app is running on Blazor WebAssembly and is affected by this bug: https://github.com/microsoft/semantic-kernel/issues/1792

I have not been able to find a workaround yet, so this is currently a blocker for us.

dmytrostruk commented 2 weeks ago

@nor0x Thanks for reporting this issue. I tried to run your code but wasn't able to reproduce this error using the gpt-4 model; the new Guid was logged to the console. Could you please specify which AI model and version of Semantic Kernel you are using? Thanks!

nor0x commented 2 weeks ago

Thanks for your reply. I'm also using gpt-4, and I have updated the NuGet packages in my repro project to the following:

    <PackageReference Include="Microsoft.AspNetCore.Components.WebAssembly" Version="8.0.6" />
    <PackageReference Include="Microsoft.AspNetCore.Components.WebAssembly.DevServer" Version="8.0.6" PrivateAssets="all" />

    <PackageReference Include="Microsoft.KernelMemory.AI.OpenAI" Version="0.64.240619.1" />
    <PackageReference Include="Microsoft.KernelMemory.Core" Version="0.64.240619.1" />
    <PackageReference Include="Microsoft.KernelMemory.SemanticKernelPlugin" Version="0.64.240619.1" />
    <PackageReference Include="Microsoft.SemanticKernel.Plugins.Memory" Version="1.13.0-alpha" />
    <PackageReference Include="Microsoft.SemanticKernel" Version="1.15.0" />

I still get the error mentioned above (see the attached screenshot).

Are you unable to reproduce it with the repro project, @dmytrostruk?

JasonHaley commented 1 week ago

I can recreate the issue, and it doesn't seem to be in Semantic Kernel's code but in the Azure.AI.OpenAI client, version 1.0.0-beta.17.

The problem may be due to it running in WebAssembly.

It is failing in Semantic Kernel's Connectors.OpenAI ClientCore.cs at line 403, where it looks like the function is supposed to be serialized (see the attached screenshot).

However, the request going to OpenAI is missing the function's metadata, due to some failure that is apparently being swallowed instead of bubbling out (see the attached screenshot).
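
For comparison, the OpenAI Chat Completions API requires each `tools` entry to carry a `function` object; the error means that object is absent from the serialized request. A well-formed `tools[0]` entry for the repro's plugin would look roughly like this (the `MyPlugin-Log` function name is an assumption about how Semantic Kernel names the tool, not something taken from the failing request):

```json
{
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "MyPlugin-Log",
        "description": "Log message to console",
        "parameters": {
          "type": "object",
          "properties": {
            "message": { "type": "string" }
          },
          "required": ["message"]
        }
      }
    }
  ]
}
```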