microsoft / semantic-kernel

Integrate cutting-edge LLM technology quickly and easily into your apps
https://aka.ms/semantic-kernel
MIT License
21.31k stars 3.13k forks

.Net: Proposal to Allow Injection of KernelArguments to the Function Auto Invocation Process #5390

Open ebigunso opened 6 months ago

ebigunso commented 6 months ago

In the existing framework, the mechanism for auto-invoking kernel functions cannot accept KernelArguments from the outside as an argument (see here). This limitation prevents predefined variables from being passed via the KernelArguments class to automatically invoked functions.

Proposed Solution: I suggest enhancing the auto invocation process by integrating KernelArguments as a parameter in key functions, specifically:

Additionally, KernelArguments should also be handled in any other relevant functions that are part of the auto invocation workflow and that currently do not pass them on.
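As a rough sketch of what the proposal could look like (the parameter name and overload shown here are hypothetical, not part of the current API), the chat completion call would gain an optional KernelArguments parameter that flows through to auto-invoked functions:

```csharp
// Hypothetical variant of the chat completion signature (sketch only):
// the caller's KernelArguments would be accepted and forwarded to any
// functions invoked automatically during tool calling.
Task<IReadOnlyList<ChatMessageContent>> GetChatMessageContentsAsync(
    ChatHistory chatHistory,
    PromptExecutionSettings? executionSettings = null,
    Kernel? kernel = null,
    KernelArguments? arguments = null,   // proposed addition
    CancellationToken cancellationToken = default);
```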

Krzysztof318 commented 6 months ago

Look here: https://github.com/microsoft/semantic-kernel/blob/4fb351dc1569044ba46189c1fd9c6b0ec1dba562/dotnet/samples/KernelSyntaxExamples/Example59_OpenAIFunctionCalling.cs#L55

WriteLine("======== Example 1: Use automated function calling with a non-streaming prompt ========");
{
    OpenAIPromptExecutionSettings settings = new() { ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions };
    WriteLine(await kernel.InvokePromptAsync("Given the current time of day and weather, what is the likely color of the sky in Boston?", new KernelArguments(settings)));
    WriteLine();
}

You are passing KernelArguments as the second parameter of the kernel.InvokePromptAsync method. If you want to use chat completion together with kernel arguments, you can create your own chat completion service with a chat prompt, as shown here: https://github.com/microsoft/semantic-kernel/blob/4fb351dc1569044ba46189c1fd9c6b0ec1dba562/dotnet/samples/KernelSyntaxExamples/Example63_ChatCompletionPrompts.cs

ebigunso commented 6 months ago

The problem I am trying to illustrate here is that the KernelArguments passed to kernel.InvokePromptAsync are not passed on to GetChatMessageContentsAsync. KernelArguments are propagated as far as KernelFunctionFromPrompt.InvokeCoreAsync here, or KernelFunctionFromMethod.InvokeCoreAsync here, but they stop there. If the KernelArguments were passed on to GetChatMessageContentsAsync, the auto-invoked functions would be able to read arguments supplied via KernelArguments. This would allow passing arguments to the auto-invoked functions programmatically, for those arguments you don't want the AI to alter. You could also use this to supply supplementary context information to the auto-invoked functions, if needed.
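For concreteness, the requested behavior might look like the following sketch (the "sessionId" variable and the plugin reading it are hypothetical; with the current framework this value is not forwarded to auto-invoked functions):

```csharp
var settings = new OpenAIPromptExecutionSettings
{
    ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
};

// Predefined variables the caller wants auto-invoked functions to see unchanged.
var arguments = new KernelArguments(settings)
{
    ["sessionId"] = "abc-123"   // hypothetical supplementary context
};

// Today, "sessionId" reaches the prompt function but is not forwarded to
// GetChatMessageContentsAsync, so an auto-invoked function declaring a
// sessionId parameter only receives whatever value the LLM generates for it.
var result = await kernel.InvokePromptAsync(
    "Look up the current user's orders.", arguments);
```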

Krzysztof318 commented 6 months ago

Curious what the SK team will say about your proposition.

For now, as a workaround, you can use EnabledFunction instead of AutoInvoke and call the function with parameters manually.
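The manual-invocation workaround can be sketched roughly as below, following the pattern in Example59_OpenAIFunctionCalling (type names such as OpenAIChatMessageContent and the TryGetFunctionAndArguments helper match the SK version referenced in this thread, but details may differ across releases; the "sessionId" value is illustrative):

```csharp
// Let the model request tool calls, but invoke them manually so that
// caller-supplied arguments can be merged in before execution.
OpenAIPromptExecutionSettings settings = new()
{
    ToolCallBehavior = ToolCallBehavior.EnableKernelFunctions
};

var chat = kernel.GetRequiredService<IChatCompletionService>();
var history = new ChatHistory();
history.AddUserMessage("Look up the current user's orders.");

var result = (OpenAIChatMessageContent)await chat.GetChatMessageContentAsync(history, settings, kernel);

foreach (var toolCall in result.ToolCalls.OfType<ChatCompletionsFunctionToolCall>())
{
    if (kernel.Plugins.TryGetFunctionAndArguments(toolCall, out var function, out var args))
    {
        args ??= new KernelArguments();
        args["sessionId"] = "abc-123"; // inject trusted values the LLM must not alter
        var functionResult = await kernel.InvokeAsync(function, args);
        // Append functionResult to the chat history and loop back to the model as needed.
    }
}
```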

ebigunso commented 6 months ago

For now, as a workaround, you can use EnabledFunction instead of AutoInvoke and call the function with parameters manually.

Yes, that is exactly the workaround I am using now. The proposed change would make life a lot easier for anyone who needs to supply arguments unaltered while still utilizing the power of function calling.

ips-jm commented 4 months ago

Is there an ETA for this?

dmytrostruk commented 4 months ago

@ebigunso @ips-jm I think this works this way on purpose, since we are fetching these arguments from the LLM response here: https://github.com/microsoft/semantic-kernel/blob/0296329886eb2116a66e5362f2cc72b42ee30157/dotnet/src/Connectors/Connectors.OpenAI/AzureSdk/ClientCore.cs#L399-L403

Otherwise, if there were both an LLM response and KernelArguments, there could be collisions, and it wouldn't be clear which source of arguments to treat as primary. It's also unclear how the automatic function calling scenario would be configured to use LLM arguments for Function1 but KernelArguments for Function2.

But if you want to override KernelArguments for a specific function before invocation, you can achieve this with Filters. A new version of filters was merged to the main branch and will be released next week. Here is an example of how to achieve this: https://github.com/microsoft/semantic-kernel/blob/0296329886eb2116a66e5362f2cc72b42ee30157/dotnet/samples/KernelSyntaxExamples/Example76_Filters.cs#L202-L211
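A sketch of the suggested approach, assuming the IFunctionInvocationFilter interface from the filters redesign mentioned above (the function name and argument key are hypothetical):

```csharp
// Filter that overrides LLM-provided arguments for a specific function
// before it is auto-invoked. Registered on the kernel or through DI.
public sealed class ArgumentOverrideFilter : IFunctionInvocationFilter
{
    public async Task OnFunctionInvocationAsync(
        FunctionInvocationContext context,
        Func<FunctionInvocationContext, Task> next)
    {
        if (context.Function.Name == "GetOrders") // hypothetical function name
        {
            // Replace whatever the LLM generated with a trusted value.
            context.Arguments["sessionId"] = "abc-123";
        }

        await next(context);
    }
}
```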

ips-jm commented 4 months ago

@dmytrostruk Thank you for pointing this out! Do I understand correctly that these filters are applied once, globally, with no adaptation possible once they are registered in the services? This would not really fit my use case, since I potentially need to "auto-call" multiple functions with a user id or session id received with an incoming chat request.

dmytrostruk commented 4 months ago

Do I understand correctly that these filters are applied once, globally, with no adaptation possible once they are registered in the services?

@ips-jm They are applied globally through DI or on the Kernel instance using the following properties:

You can remove or reorder them at runtime.

This would not really fit my use case, since I potentially need to "auto-call" multiple functions with a user id or session id received with an incoming chat request.

In this case, before executing automatic function calling, you can use the kernel.Data property to set data associated with the specific request (shown in a screenshot in the original comment).

In my filter, I'm then accessing the data associated with the specific request by using the context.Kernel.Data property (also shown in a screenshot in the original comment).
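The two screenshots referenced above are not preserved in this copy of the thread; the pattern they described can be sketched roughly as follows (the filter class, key name, and request object are illustrative):

```csharp
// Filter that reads per-request data back out of Kernel.Data and makes it
// available to the function being auto-invoked.
public sealed class SessionDataFilter : IFunctionInvocationFilter
{
    public async Task OnFunctionInvocationAsync(
        FunctionInvocationContext context,
        Func<FunctionInvocationContext, Task> next)
    {
        // Access data associated with the current request.
        if (context.Kernel.Data.TryGetValue("sessionId", out var sessionId))
        {
            context.Arguments["sessionId"] = sessionId;
        }

        await next(context);
    }
}

// At request time, before invoking the prompt with automatic function calling:
// kernel.Data["sessionId"] = request.SessionId;  // hypothetical incoming request
```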

ips-jm commented 4 months ago

Thank you!

github-actions[bot] commented 1 month ago

This issue is stale because it has been open for 90 days with no activity.