microsoft / semantic-kernel

Integrate cutting-edge LLM technology quickly and easily into your apps
https://aka.ms/semantic-kernel
MIT License
20.35k stars 2.96k forks

.Net Bug: In Blazor - Unhandled Exception: System.NotSupportedException: Synchronous reads are not supported, use ReadAsync instead. #6994

Open AshD opened 5 days ago

AshD commented 5 days ago

**Describe the bug**
In .NET 8 this works fine, but under Blazor WASM `Microsoft.SemanticKernel.Connectors.OpenAI.ClientCore.GetStreamingChatMessageContentsAsync` throws an exception.

```
Unhandled Exception: System.NotSupportedException: Synchronous reads are not supported, use ReadAsync instead.
   at System.Net.Http.WasmHttpReadStream.Read(Byte[] buffer, Int32 offset, Int32 count)
   at System.IO.DelegatingStream.Read(Byte[] buffer, Int32 offset, Int32 count)
   at Azure.Core.Pipeline.ReadTimeoutStream.Read(Byte[] buffer, Int32 offset, Int32 count)
   at System.IO.Stream.CopyTo(Stream destination, Int32 bufferSize)
   at System.IO.Stream.CopyTo(Stream destination)
   at Azure.RequestFailedException.BufferResponseIfNeeded(Response response)
   at Azure.RequestFailedException.GetRequestFailedExceptionContent(Response response, RequestFailedDetailsParser parser)
   at Azure.RequestFailedException..ctor(Response response, Exception innerException, RequestFailedDetailsParser detailsParser)
   at Azure.RequestFailedException..ctor(Response response, Exception innerException)
   at Azure.RequestFailedException..ctor(Response response)
   at Azure.Core.HttpPipelineExtensions.ProcessMessageAsync(HttpPipeline pipeline, HttpMessage message, RequestContext requestContext, CancellationToken cancellationToken)
   at Azure.AI.OpenAI.OpenAIClient.GetChatCompletionsStreamingAsync(ChatCompletionsOptions chatCompletionsOptions, CancellationToken cancellationToken)
   at Microsoft.SemanticKernel.Connectors.OpenAI.ClientCore.d__55`1[[Azure.AI.OpenAI.StreamingResponse`1[[Azure.AI.OpenAI.StreamingChatCompletionsUpdate, Azure.AI.OpenAI, Version=1.0.0.0, Culture=neutral, PublicKeyToken=92742159e12e44c8]], Azure.AI.OpenAI, Version=1.0.0.0, Culture=neutral, PublicKeyToken=92742159e12e44c8]].MoveNext()
   at Microsoft.SemanticKernel.Connectors.OpenAI.ClientCore.GetStreamingChatMessageContentsAsync(ChatHistory chat, PromptExecutionSettings executionSettings, Kernel kernel, CancellationToken cancellationToken)+MoveNext()
   at Microsoft.SemanticKernel.Connectors.OpenAI.ClientCore.GetStreamingChatMessageContentsAsync(ChatHistory chat, PromptExecutionSettings executionSettings, Kernel kernel, CancellationToken cancellationToken)+System.Threading.Tasks.Sources.IValueTaskSource.GetResult()
```
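For background, the fetch-backed response streams in Blazor WASM support only asynchronous reads, which is what the top frame (`WasmHttpReadStream.Read`) is complaining about. A minimal illustration (hypothetical `httpClient` and `url`; not the library's code):

```csharp
// Blazor WASM: response bodies must be consumed asynchronously.
using var response = await httpClient.GetAsync(url, HttpCompletionOption.ResponseHeadersRead);
using var stream = await response.Content.ReadAsStreamAsync();

var buffer = new byte[8192];
// stream.Read(buffer, 0, buffer.Length) would throw NotSupportedException
// under WASM; only the async API is supported:
int bytesRead = await stream.ReadAsync(buffer, 0, buffer.Length);
```

Judging from the trace, the synchronous `Stream.CopyTo` is called from `Azure.RequestFailedException.BufferResponseIfNeeded`, i.e. while the SDK is buffering the body of an already-failed request to build the exception, so the `NotSupportedException` is likely masking the underlying request failure.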

**To Reproduce**
Call `Microsoft.SemanticKernel.Connectors.OpenAI.ClientCore.GetStreamingChatMessageContentsAsync` from a Blazor WASM app.

**Expected behavior**
Streaming works as it does in regular .NET 8.

**Platform**

**Additional context**
I passed it a new `HttpClient` like this:

```csharp
var httpClient = new HttpClient { BaseAddress = new Uri(builder.HostEnvironment.BaseAddress) };
```

AshD commented 5 days ago

I am creating the chat completion service for OpenAI like this. It works in a Windows WPF app but throws the exception in a Blazor WASM app.

```csharp
var httpClient = new HttpClient { BaseAddress = new Uri(builder.HostEnvironment.BaseAddress) };
chatCompletionService = new OpenAIChatCompletionService(AIService.ModelName, AIService.Key, null, httpClient);
```

AshD commented 4 days ago

Some more info: Azure.AI.OpenAI 1.0.0-beta.17 is not sending the correct JSON to the OpenAI endpoint under Blazor. The `content` field is missing from the messages it sends:

```json
{"messages":[{"role":"system"},{"role":"user"}],"max_tokens":30000,"temperature":0,"top_p":1,"n":1,"stop":["user:","User:","Question:","ZZ"],"presence_penalty":0,"frequency_penalty":0,"stream":true,"model":"gpt-4o"}
```
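For comparison, a well-formed request body carries a `content` string on each message. A sketch of what the same payload should roughly look like (message texts are illustrative, not taken from the app):

```json
{
  "messages": [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello"}
  ],
  "max_tokens": 30000,
  "temperature": 0,
  "top_p": 1,
  "n": 1,
  "stop": ["user:", "User:", "Question:", "ZZ"],
  "presence_penalty": 0,
  "frequency_penalty": 0,
  "stream": true,
  "model": "gpt-4o"
}
```

A payload without `content` would be rejected by the endpoint, which is consistent with the `RequestFailedException` frames in the stack trace above.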