Describe the bug
The Ollama and ONNX connectors fail to deserialize their execution settings when those settings include non-string JSON values, for example when using CreateFunctionFromPromptYaml with a template that includes a "temperature" setting.
The reason is that both OllamaPromptExecutionSettings and OnnxRuntimeGenAIPromptExecutionSettings are missing the JsonNumberHandling serialization attribute, which is present on the PromptExecutionSettings of other connectors. For example:
[JsonNumberHandling(JsonNumberHandling.AllowReadingFromString)]
public class OpenAIPromptExecutionSettings : PromptExecutionSettings
{
    ...
}
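For context, here is a minimal standalone sketch of what the attribute changes in System.Text.Json. The `WithNumberHandling`/`WithoutNumberHandling` types are hypothetical stand-ins for the connector classes, reduced to a single `temperature` property:

```csharp
using System;
using System.Globalization;
using System.Text.Json;
using System.Text.Json.Serialization;

// Hypothetical minimal type without the attribute (like the Ollama/ONNX settings today).
public class WithoutNumberHandling
{
    [JsonPropertyName("temperature")]
    public float? Temperature { get; set; }
}

// Same type with the attribute the other connectors carry.
[JsonNumberHandling(JsonNumberHandling.AllowReadingFromString)]
public class WithNumberHandling
{
    [JsonPropertyName("temperature")]
    public float? Temperature { get; set; }
}

public static class Demo
{
    public static void Main()
    {
        // The settings round-trip ends up presenting the value as a JSON string.
        const string json = "{\"temperature\":\"0.6\"}";

        // With AllowReadingFromString, the quoted number is accepted.
        var ok = JsonSerializer.Deserialize<WithNumberHandling>(json);
        Console.WriteLine(ok!.Temperature?.ToString(CultureInfo.InvariantCulture)); // 0.6

        // Without the attribute, System.Text.Json throws, matching the exception below.
        try
        {
            JsonSerializer.Deserialize<WithoutNumberHandling>(json);
            Console.WriteLine("no exception");
        }
        catch (JsonException)
        {
            Console.WriteLine("JsonException");
        }
    }
}
```

This reproduces the failure mode in isolation: the same payload deserializes cleanly with the attribute and throws `JsonException` without it.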
I haven't used them, but the following execution settings classes appear to be missing this attribute as well:
HuggingFacePromptExecutionSettings
OpenAIAudioToTextExecutionSettings
OpenAITextToAudioExecutionSettings
To Reproduce
string prompt = """
    name: QuestionAnswer
    template: |
      Question: {{$question}}
    template_format: semantic-kernel
    description: Answer any question
    input_variables:
      - name: question
        description: The question.
        is_required: true
    output_variable:
      description: The answer.
    execution_settings:
      default:
        temperature: 0.6
    """;
Kernel kernel = Kernel.CreateBuilder()
    .AddOllamaChatCompletion("llama3.2", new Uri("http://localhost:11434"))
    //.AddOnnxRuntimeGenAIChatCompletion("phi-3-4K", modelPath: modelPath_4K)
    .Build();

KernelFunction function = kernel.CreateFunctionFromPromptYaml(prompt);
var result = await function.InvokeAsync(kernel, new() { { "question", question } });
Console.WriteLine(result.GetValue<string>());
This fails with the following exception:
System.Text.Json.JsonException
HResult=0x80131500
Message=The JSON value could not be converted to System.Nullable`1[System.Single]. Path: $.temperature | LineNumber: 0 | BytePositionInLine: 86.
Source=System.Text.Json
StackTrace:
at System.Text.Json.ThrowHelper.ReThrowWithPath(ReadStack& state, Utf8JsonReader& reader, Exception ex)
at System.Text.Json.Serialization.JsonConverter`1.ReadCore(Utf8JsonReader& reader, JsonSerializerOptions options, ReadStack& state)
at System.Text.Json.JsonSerializer.ReadFromSpan[TValue](ReadOnlySpan`1 utf8Json, JsonTypeInfo`1 jsonTypeInfo, Nullable`1 actualByteCount)
at System.Text.Json.JsonSerializer.ReadFromSpan[TValue](ReadOnlySpan`1 json, JsonTypeInfo`1 jsonTypeInfo)
at System.Text.Json.JsonSerializer.Deserialize[TValue](String json, JsonSerializerOptions options)
at Microsoft.SemanticKernel.Connectors.Ollama.OllamaPromptExecutionSettings.FromExecutionSettings(PromptExecutionSettings executionSettings)
at Microsoft.SemanticKernel.Connectors.Ollama.OllamaChatCompletionService.<GetChatMessageContentsAsync>d__5.MoveNext()
at Microsoft.SemanticKernel.KernelFunctionFromPrompt.<GetChatCompletionResultAsync>d__20.MoveNext()
at Microsoft.SemanticKernel.KernelFunctionFromPrompt.<InvokeCoreAsync>d__3.MoveNext()
at System.Threading.Tasks.ValueTask`1.get_Result()
at Microsoft.SemanticKernel.KernelFunction.<>c__DisplayClass21_0.<<InvokeAsync>b__0>d.MoveNext()
at Microsoft.SemanticKernel.Kernel.<InvokeFilterOrFunctionAsync>d__34.MoveNext()
at Microsoft.SemanticKernel.Kernel.<OnFunctionInvocationAsync>d__33.MoveNext()
at Microsoft.SemanticKernel.KernelFunction.<InvokeAsync>d__21.MoveNext()
at Program.<<<Main>$>g__InvokePrompt4|0_3>d.MoveNext() in ...
at Program.<<Main>$>d__0.MoveNext() in ...
at Program.<Main>(String[] args)
This exception was originally thrown at this call stack:
System.Text.Json.ThrowHelper.ThrowInvalidOperationException_ExpectedNumber(System.Text.Json.JsonTokenType)
System.Text.Json.Utf8JsonReader.TryGetSingle(out float)
System.Text.Json.Utf8JsonReader.GetSingle()
System.Text.Json.Serialization.Converters.NullableConverter<T>.Read(ref System.Text.Json.Utf8JsonReader, System.Type, System.Text.Json.JsonSerializerOptions)
System.Text.Json.Serialization.Metadata.JsonPropertyInfo<T>.ReadJsonAndSetMember(object, ref System.Text.Json.ReadStack, ref System.Text.Json.Utf8JsonReader)
System.Text.Json.Serialization.Converters.ObjectDefaultConverter<T>.OnTryRead(ref System.Text.Json.Utf8JsonReader, System.Type, System.Text.Json.JsonSerializerOptions, ref System.Text.Json.ReadStack, out T)
System.Text.Json.Serialization.JsonConverter<T>.TryRead(ref System.Text.Json.Utf8JsonReader, System.Type, System.Text.Json.JsonSerializerOptions, ref System.Text.Json.ReadStack, out T, out bool)
System.Text.Json.Serialization.JsonConverter<T>.ReadCore(ref System.Text.Json.Utf8JsonReader, System.Text.Json.JsonSerializerOptions, ref System.Text.Json.ReadStack)
Inner Exception 1:
InvalidOperationException: Cannot get the value of a token type 'String' as a number.
Platform
OS: Windows
IDE: Visual Studio
Language: C#
Source: 1.24.0-alpha (of the Ollama and ONNX connectors)