aitrailblazer opened 2 months ago
Same for me. It worked fine before.
It's a dependency problem: Semantic Kernel and SmartComponents seem to use different versions of the OpenAI SDK. I tried installing Semantic Kernel in this repo and it suddenly started failing.
This can help you.
Create a new inference backend; in my case it's for OpenAI:
using Microsoft.Extensions.Configuration;
using OpenAI;
using OpenAI.Chat;
using SmartComponents.StaticAssets.Inference;
using ChatMessageRole = SmartComponents.StaticAssets.Inference.ChatMessageRole;

namespace Application.Infrastructure.SmartComponentsBackend
{
    public class SmartComponentsBack(IConfiguration configuration) : IInferenceBackend
    {
        public async Task<string> GetChatResponseAsync(ChatParameters options)
        {
            var apiConfig = new ApiConfig(configuration);
            var client = CreateClient(apiConfig);

            var chatCompletionsOptions = new ChatCompletionOptions
            {
                Temperature = options.Temperature ?? 0f,
                TopP = options.TopP ?? 1,
                MaxTokens = options.MaxTokens ?? 200,
                FrequencyPenalty = options.FrequencyPenalty ?? 0,
                PresencePenalty = options.PresencePenalty ?? 0,
            };

            // Populate stop sequences before sending the request.
            if (options.StopSequences is { } stopSequences)
            {
                foreach (var stopSequence in stopSequences)
                {
                    chatCompletionsOptions.StopSequences.Add(stopSequence);
                }
            }

            // Map SmartComponents message types onto the OpenAI 2.x message types.
            var messages = options.Messages?.Select<SmartComponents.StaticAssets.Inference.ChatMessage, OpenAI.Chat.ChatMessage>(message =>
                message.Role switch
                {
                    ChatMessageRole.System => new SystemChatMessage(message.Text),
                    ChatMessageRole.User => new UserChatMessage(message.Text),
                    ChatMessageRole.Assistant => new AssistantChatMessage(message.Text),
                    _ => throw new InvalidOperationException("Unknown message role")
                }).ToArray();

            var completionsResponse = await client.CompleteChatAsync(messages, chatCompletionsOptions);
            return completionsResponse.Value.Content?.FirstOrDefault()?.Text ?? string.Empty;
        }

        private static ChatClient CreateClient(ApiConfig apiConfig)
        {
            var client = new OpenAIClient(apiConfig.ApiKey!);
            return client.GetChatClient(apiConfig.DeploymentName);
        }
    }
}
And then, in your program's service registration:

services.AddSmartComponents()
    .WithInferenceBackend<SmartComponentsBack>();
This is needed because in version 2 of the OpenAI SDK these classes changed significantly, so the current SmartComponents implementation will not work. The backend above initializes the OpenAI key, etc. separately; ideally it would reuse the clients a bigger project has already registered.
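As a minimal sketch of that reuse (assuming the host app already registers an `OpenAIClient` in DI, and assuming a hypothetical `OpenAI:Model` configuration key; names are illustrative, not from SmartComponents), the backend could take the shared client as a constructor dependency instead of re-reading the API key:

```csharp
using Microsoft.Extensions.Configuration;
using OpenAI;
using OpenAI.Chat;
using SmartComponents.StaticAssets.Inference;

// Hypothetical variant: reuse the app's already-registered OpenAIClient
// rather than building a second one from a separate API-key setting.
public class SharedClientBackend(OpenAIClient openAI, IConfiguration configuration) : IInferenceBackend
{
    public async Task<string> GetChatResponseAsync(ChatParameters options)
    {
        // "OpenAI:Model" is an assumed configuration key; adjust to your setup.
        var chat = openAI.GetChatClient(configuration["OpenAI:Model"]);

        // Simplified: forwards only the latest user text; a full backend would
        // map all roles and ChatParameters as in the class above.
        var completion = await chat.CompleteChatAsync(
            new UserChatMessage(options.Messages?.LastOrDefault()?.Text ?? string.Empty));
        return completion.Value.Content?.FirstOrDefault()?.Text ?? string.Empty;
    }
}

// Registration stays the same shape:
// services.AddSmartComponents().WithInferenceBackend<SharedClientBackend>();
```

Because the `OpenAIClient` comes from DI, Semantic Kernel and SmartComponents then share one client and one configured credential instead of two divergent initializations.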
Integrating with Blazor Server, I'm receiving the following error:
fail: Microsoft.AspNetCore.Diagnostics.DeveloperExceptionPageMiddleware[1]
      An unhandled exception has occurred while executing the request.
      System.TypeLoadException: Could not load type 'Azure.AI.OpenAI.ChatCompletions' from assembly 'Azure.AI.OpenAI, Version=2.0.0.0, Culture=neutral, PublicKeyToken=92742159e12e44c8'.
         at SmartComponents.Inference.OpenAI.OpenAIInferenceBackend.GetChatResponseAsync(ChatParameters options)
         at SmartComponents.Inference.SmartTextAreaInference.GetInsertionSuggestionAsync(IInferenceBackend inference, SmartTextAreaConfig config, String textBefore, String textAfter)
         at Microsoft.AspNetCore.Builder.SmartComponentsServiceCollectionExtensions.AttachSmartComponentsEndpointsStartupFilter.<>c__DisplayClass0_1.<b__3>d.MoveNext()
--- End of stack trace from previous location ---
at Microsoft.AspNetCore.Http.RequestDelegateFactory.ExecuteTaskResult[T](Task`1 task, HttpContext httpContext)
at Microsoft.AspNetCore.Authorization.AuthorizationMiddleware.Invoke(HttpContext context)
at Microsoft.AspNetCore.Authentication.AuthenticationMiddleware.Invoke(HttpContext context)
at Microsoft.AspNetCore.Diagnostics.DeveloperExceptionPageMiddlewareImpl.Invoke(HttpContext context)