@iroy2000 I tried out this scenario here https://github.com/microsoft/semantic-kernel/pull/7285/files and I cannot reproduce the issue. I tried a few different models. Can you take a look at my sample and adjust it to match your scenario?
Thanks @markwallace-microsoft
I'm trying to change my code to mimic your PR.
Now it seems like "a function" is "invoked", but when I place the Console output in the plugin (the LightPlugin code above), it is not displayed. How do I know whether the correct function was executed?
Another update: I don't think it is invoking the correct function. I tried adding the following to the LightPlugin > ChangeState method, and I don't see the file being created either.
File.WriteAllText("/Users/myself/Desktop/lightState.txt", $"[Light is now {state}]");
(For example, in the output below it seems like a function is executing, but how do you know it is the right function?)
...
Function Function_3ff789fad6844a8cb1796a61b80c5da8 invoking.
... // and later
Function Function_3ff789fad6844a8cb1796a61b80c5da8 succeeded
For reference, here is the output:
trce: Microsoft.SemanticKernel.KernelPromptTemplate[0]
Extracting blocks from template: <message role="user">Turn on the light</message>
chatSemanticFunction: Function_3ff789fad6844a8cb1796a61b80c5da8
**info: Function_3ff789fad6844a8cb1796a61b80c5da8[0]
Function Function_3ff789fad6844a8cb1796a61b80c5da8 invoking.**
trce: Function_3ff789fad6844a8cb1796a61b80c5da8[0]
Function arguments: {}
trce: Microsoft.SemanticKernel.KernelFunctionFactory[0]
Rendered prompt: <message role="user">Turn on the light</message>
trce: Microsoft.SemanticKernel.Connectors.OpenAI.AzureOpenAIChatCompletionService[0]
ChatHistory: [{"Role":{"Label":"user"},"Items":[{"$type":"TextContent","Text":"Turn on the light"}]}], Settings: {"temperature":1,"top_p":1,"presence_penalty":0,"frequency_penalty":0,"max_tokens":null,"stop_sequences":null,"results_per_prompt":1,"seed":null,"response_format":null,"chat_system_prompt":null,"token_selection_biases":null,"ToolCallBehavior":{"ToolCallResultSerializerOptions":null},"User":null,"logprobs":null,"top_logprobs":null,"service_id":null,"model_id":null}
info: Microsoft.SemanticKernel.Connectors.OpenAI.AzureOpenAIChatCompletionService[0]
Prompt tokens: 73. Completion tokens: 62. Total tokens: 135.
info: Microsoft.SemanticKernel.KernelFunctionFactory[0]
Prompt tokens: 73. Completion tokens: 62.
**info: Function_3ff789fad6844a8cb1796a61b80c5da8[0]
Function Function_3ff789fad6844a8cb1796a61b80c5da8 succeeded.**
trce: Function_3ff789fad6844a8cb1796a61b80c5da8[0]
Function result: Here's an example of how to use the `Light-ChangeState` function to turn on a light:
lightChangeState({
newState: true
})
This assumes that `lightChangeState` is assigned with a function that implements the `Light-ChangeState` type.
info: Function_3ff789fad6844a8cb1796a61b80c5da8[0]
Function completed. Duration: 2.7151387s
Sure, here's an example implementation using the `Light-ChangeState` function:
const changeLightState: functions.Light-ChangeState = ({ newState }) => {
// implementation to change the state of the light
}
changeLightState({ newState: true }); // turns on the light
changeLightState({ newState: false }); // turns off the light
Here is the original code I'm using: https://learn.microsoft.com/en-us/semantic-kernel/get-started/quick-start-guide?pivots=programming-language-csharp
Hi @iroy2000, I've updated my sample to include a function-calling filter which logs the function calls. You should be seeing two function calls:
Function Function_bffff963b1cc4693a99e80693667b0fa is being invoked with arguments: {}
This is the anonymous function that is created when you invoke a prompt
Function ChangeState is being invoked with arguments: {"newState":"True"}
This is the function call made in response to the LLM requesting a tool call
I don't think you are getting this second one. Can you add logging of the messages being sent to the LLM so we can compare?
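For reference, the filter is a standard IFunctionInvocationFilter registered on the kernel builder. A minimal sketch that produces the "is being invoked" lines above could look like this (the class name and exact log wording are illustrative):

using System.Text.Json;
using Microsoft.SemanticKernel;

// Logs every kernel function invocation together with its arguments.
public sealed class FunctionCallLoggingFilter : IFunctionInvocationFilter
{
    public async Task OnFunctionInvocationAsync(
        FunctionInvocationContext context,
        Func<FunctionInvocationContext, Task> next)
    {
        Console.WriteLine(
            $"Function {context.Function.Name} is being invoked with arguments: " +
            JsonSerializer.Serialize(context.Arguments));

        await next(context); // run the actual function (or the next filter in the chain)
    }
}

// Registration on the kernel builder, e.g.:
// builder.Services.AddSingleton<IFunctionInvocationFilter, FunctionCallLoggingFilter>();

The === REQUEST === / === RESPONSE === dumps below are the raw messages exchanged with the model. One way to capture them is a DelegatingHandler on an HttpClient handed to the connector (a sketch, assuming the AddAzureOpenAIChatCompletion overload that accepts an HttpClient):

using System.Net.Http;

// Dumps each request and response body exchanged with the chat completion endpoint.
public sealed class RequestLoggingHandler : DelegatingHandler
{
    public RequestLoggingHandler(HttpMessageHandler inner) : base(inner) { }

    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        if (request.Content is not null)
        {
            Console.WriteLine("=== REQUEST ===");
            Console.WriteLine(await request.Content.ReadAsStringAsync());
        }

        HttpResponseMessage response = await base.SendAsync(request, cancellationToken);

        Console.WriteLine("=== RESPONSE ===");
        Console.WriteLine(await response.Content.ReadAsStringAsync());

        return response;
    }
}

// Wire it up when registering the connector, e.g.:
// var httpClient = new HttpClient(new RequestLoggingHandler(new HttpClientHandler()));
// builder.AddAzureOpenAIChatCompletion(deploymentName, endpoint, apiKey, httpClient: httpClient);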
Here is the full trace
Function Function_bffff963b1cc4693a99e80693667b0fa is being invoked with arguments: {}
=== REQUEST ===
{
"messages": [
{
"content": "Turn on the light?",
"role": "user"
}
],
"temperature": 1,
"top_p": 1,
"n": 1,
"presence_penalty": 0,
"frequency_penalty": 0,
"model": "gpt-4o",
"tools": [
{
"function": {
"name": "LightPlugin-GetState",
"description": "Gets the state of the light.",
"parameters": {
"type": "object",
"required": [],
"properties": {}
}
},
"type": "function"
},
{
"function": {
"name": "LightPlugin-ChangeState",
"description": "Changes the state of the light.\u0027",
"parameters": {
"type": "object",
"required": [
"newState"
],
"properties": {
"newState": {
"type": "boolean"
}
}
}
},
"type": "function"
}
],
"tool_choice": "auto"
}
=== RESPONSE ===
{
"id": "chatcmpl-9mRAEGuJJIco3eu6ioF61EJ4T4cn1",
"object": "chat.completion",
"created": 1721331190,
"model": "gpt-4o-2024-05-13",
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"content": null,
"tool_calls": [
{
"id": "call_Xn1FRw9Uf86SWnoSVyeeTQni",
"type": "function",
"function": {
"name": "LightPlugin-GetState",
"arguments": "{}"
}
}
]
},
"logprobs": null,
"finish_reason": "tool_calls"
}
],
"usage": {
"prompt_tokens": 71,
"completion_tokens": 13,
"total_tokens": 84
},
"system_fingerprint": "fp_c4e5b6fa31"
}
Function GetState is being invoked with arguments: {}
=== REQUEST ===
{
"messages": [
{
"content": "Turn on the light?",
"role": "user"
},
{
"content": null,
"tool_calls": [
{
"function": {
"name": "LightPlugin-GetState",
"arguments": "{}"
},
"type": "function",
"id": "call_Xn1FRw9Uf86SWnoSVyeeTQni"
}
],
"role": "assistant"
},
{
"content": "off",
"tool_call_id": "call_Xn1FRw9Uf86SWnoSVyeeTQni",
"role": "tool"
}
],
"temperature": 1,
"top_p": 1,
"n": 1,
"presence_penalty": 0,
"frequency_penalty": 0,
"model": "gpt-4o",
"tools": [
{
"function": {
"name": "LightPlugin-GetState",
"description": "Gets the state of the light.",
"parameters": {
"type": "object",
"required": [],
"properties": {}
}
},
"type": "function"
},
{
"function": {
"name": "LightPlugin-ChangeState",
"description": "Changes the state of the light.\u0027",
"parameters": {
"type": "object",
"required": [
"newState"
],
"properties": {
"newState": {
"type": "boolean"
}
}
}
},
"type": "function"
}
],
"tool_choice": "auto"
}
=== RESPONSE ===
{
"id": "chatcmpl-9mRAEZDQf5eE2TX6Ck3YN222K7ds1",
"object": "chat.completion",
"created": 1721331190,
"model": "gpt-4o-2024-05-13",
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"content": null,
"tool_calls": [
{
"id": "call_ssRBa870MroK9UqL5wEaA4Lf",
"type": "function",
"function": {
"name": "LightPlugin-ChangeState",
"arguments": "{\"newState\":true}"
}
}
]
},
"logprobs": null,
"finish_reason": "tool_calls"
}
],
"usage": {
"prompt_tokens": 96,
"completion_tokens": 18,
"total_tokens": 114
},
"system_fingerprint": "fp_c4e5b6fa31"
}
Function ChangeState is being invoked with arguments: {"newState":"True"}
=== REQUEST ===
{
"messages": [
{
"content": "Turn on the light?",
"role": "user"
},
{
"content": null,
"tool_calls": [
{
"function": {
"name": "LightPlugin-GetState",
"arguments": "{}"
},
"type": "function",
"id": "call_Xn1FRw9Uf86SWnoSVyeeTQni"
}
],
"role": "assistant"
},
{
"content": "off",
"tool_call_id": "call_Xn1FRw9Uf86SWnoSVyeeTQni",
"role": "tool"
},
{
"content": null,
"tool_calls": [
{
"function": {
"name": "LightPlugin-ChangeState",
"arguments": "{\u0022newState\u0022:true}"
},
"type": "function",
"id": "call_ssRBa870MroK9UqL5wEaA4Lf"
}
],
"role": "assistant"
},
{
"content": "on",
"tool_call_id": "call_ssRBa870MroK9UqL5wEaA4Lf",
"role": "tool"
}
],
"temperature": 1,
"top_p": 1,
"n": 1,
"presence_penalty": 0,
"frequency_penalty": 0,
"model": "gpt-4o",
"tools": [
{
"function": {
"name": "LightPlugin-GetState",
"description": "Gets the state of the light.",
"parameters": {
"type": "object",
"required": [],
"properties": {}
}
},
"type": "function"
},
{
"function": {
"name": "LightPlugin-ChangeState",
"description": "Changes the state of the light.\u0027",
"parameters": {
"type": "object",
"required": [
"newState"
],
"properties": {
"newState": {
"type": "boolean"
}
}
}
},
"type": "function"
}
],
"tool_choice": "auto"
}
=== RESPONSE ===
{
"id": "chatcmpl-9mRAFUnI2BBQKaSYRaxeeeN0kQkKr",
"object": "chat.completion",
"created": 1721331191,
"model": "gpt-4o-2024-05-13",
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"content": "The light has been turned on."
},
"logprobs": null,
"finish_reason": "stop"
}
],
"usage": {
"prompt_tokens": 126,
"completion_tokens": 8,
"total_tokens": 134
},
"system_fingerprint": "fp_c4e5b6fa31"
}
The light has been turned on.
Hi @iroy2000 if you need more support please re-open this issue. Thanks.
Here is the LightPlugin.cs I'm using as my example
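It's essentially the LightPlugin from the quick-start guide; roughly this (a sketch, with the descriptions taken from the tool definitions shown in the trace above):

using System.ComponentModel;
using Microsoft.SemanticKernel;

public class LightPlugin
{
    public bool IsOn { get; set; } = false;

    [KernelFunction]
    [Description("Gets the state of the light.")]
    public string GetState() => this.IsOn ? "on" : "off";

    [KernelFunction]
    [Description("Changes the state of the light.")]
    public string ChangeState(bool newState)
    {
        this.IsOn = newState;
        var state = this.GetState();

        // Console output used to verify the function is actually invoked
        Console.WriteLine($"[Light is now {state}]");

        return state;
    }
}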
And in my Program.cs, here is the main gist of setting up the kernel and plugin
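Roughly, following the quick-start guide (deployment name, endpoint, and key are placeholders here):

using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

var builder = Kernel.CreateBuilder();
builder.AddAzureOpenAIChatCompletion(
    deploymentName: "gpt-35-turbo",
    endpoint: "https://<my-resource>.openai.azure.com/",
    apiKey: "<api-key>");
builder.Plugins.AddFromType<LightPlugin>();
Kernel kernel = builder.Build();

// From the guide: this setting is what lets the kernel invoke the requested
// tool calls automatically instead of returning them as text.
OpenAIPromptExecutionSettings settings = new()
{
    ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
};

var chatService = kernel.GetRequiredService<IChatCompletionService>();
var history = new ChatHistory();
history.AddUserMessage("Turn on the light");

var result = await chatService.GetChatMessageContentAsync(history, settings, kernel);
Console.WriteLine(result);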
It seems to understand the context when I ask a question, but instead of invoking the function, it just tells me which function to call.
I'm wondering if there is any config / settings I'm missing. Thanks.
I'm using gpt-35-turbo.