microsoft / semantic-kernel

Integrate cutting-edge LLM technology quickly and easily into your apps
https://aka.ms/semantic-kernel
MIT License

.Net: Add Microsoft.SemanticKernel.Connectors.OpenAI metrics to existing instance of ILogger #5917

Closed gdodd1977 closed 5 months ago

gdodd1977 commented 5 months ago

I'm trying to hook up the features listed in this blog post: https://devblogs.microsoft.com/semantic-kernel/track-your-token-usage-and-costs-with-semantic-kernel/

I have an instance of ILogger that writes to Kusto clusters. How can I use this new Microsoft.SemanticKernel.Connectors.OpenAI feature to log those token counts to my Kusto cluster?

I tried this, with no luck:

    builder.Services.AddLogging(loggingBuilder =>
    {
        loggingBuilder.AddDebug();
        loggingBuilder.AddTraceSource("Microsoft.SemanticKernel.Connectors.OpenAI");
        loggingBuilder.Services.AddSingleton(factory => factory.GetRequiredService<ILoggerFactory>());
    });

There isn't really any documentation that I can see other than how to hook into AppInsights. I'm just not sure how to hook it into my existing ILogger that writes to Kusto.

dmytrostruk commented 5 months ago

@gdodd1977 Could you please share how you initialize the OpenAI connector? If you use AddOpenAIChatCompletion, for example, things like HttpClient and ILoggerFactory should be injected into OpenAIChatCompletionService from the service provider: https://github.com/microsoft/semantic-kernel/blob/c8ce2492acd0df0ba1d0bb0368645bc9f5e2a8e1/dotnet/src/Connectors/Connectors.OpenAI/OpenAIServiceCollectionExtensions.cs#L988
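For reference, something along these lines is the usual wiring (an untested sketch; the model id and the Kusto logger provider name are placeholders for your own setup):

```csharp
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using Microsoft.SemanticKernel;

var builder = Kernel.CreateBuilder();

// Logging providers registered on the kernel's service collection become the
// ILoggerFactory that AddOpenAIChatCompletion resolves from the service provider.
builder.Services.AddLogging(loggingBuilder =>
{
    loggingBuilder.SetMinimumLevel(LogLevel.Information);
    // loggingBuilder.AddProvider(new KustoLoggerProvider()); // hypothetical: plug in your existing Kusto provider here
});

builder.AddOpenAIChatCompletion(modelId: "gpt-4", apiKey: "...");

Kernel kernel = builder.Build();
```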

Also, make sure LogLevel.Information is enabled, because we use this log level to log token usage information: https://github.com/microsoft/semantic-kernel/blob/c8ce2492acd0df0ba1d0bb0368645bc9f5e2a8e1/dotnet/src/Connectors/Connectors.OpenAI/AzureSdk/ClientCore.cs#L1238-L1243
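If you only want to raise the level for the Semantic Kernel categories rather than globally, a filter along these lines should work (the category prefix is illustrative; the actual logger category is the connector's type name):

```csharp
builder.Services.AddLogging(loggingBuilder =>
{
    // Token usage is logged at Information, so make sure nothing filters the
    // Semantic Kernel categories above that level.
    loggingBuilder.AddFilter("Microsoft.SemanticKernel", LogLevel.Information);
});
```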

Please let me know if that helps, thank you!

gdodd1977 commented 5 months ago

We have our own concrete classes that implement IChatCompletionService because we have a custom OpenAI client. We don't use that AddOpenAIChatCompletion extension, but we register ours similarly to your extension method:

[screenshot of the registration code]

Just browsing the code, I'd wager we are out of luck and need to just roll our own logging similar to what you all are doing in your ChatCompletionService.

dmytrostruk commented 5 months ago

> Just browsing the code, I'd wager we are out of luck and need to just roll our own logging similar to what you all are doing in your ChatCompletionService.

@gdodd1977 Yes, if you use your own ChatCompletionService, you need to make sure that you log everything you need on your side, especially the number of tokens.
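A rough sketch of how you could wrap your custom service so token usage gets logged; the "Usage" metadata key and its shape depend on the client you use, so treat this as an illustration rather than the exact contract:

```csharp
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

// Decorator around your existing custom service; logs whatever usage metadata
// the underlying client attaches to each response.
public sealed class LoggingChatCompletionService : IChatCompletionService
{
    private readonly IChatCompletionService _inner;
    private readonly ILogger<LoggingChatCompletionService> _logger;

    public LoggingChatCompletionService(IChatCompletionService inner, ILogger<LoggingChatCompletionService> logger)
    {
        _inner = inner;
        _logger = logger;
    }

    public IReadOnlyDictionary<string, object?> Attributes => _inner.Attributes;

    public async Task<IReadOnlyList<ChatMessageContent>> GetChatMessageContentsAsync(
        ChatHistory chatHistory,
        PromptExecutionSettings? executionSettings = null,
        Kernel? kernel = null,
        CancellationToken cancellationToken = default)
    {
        var results = await _inner.GetChatMessageContentsAsync(chatHistory, executionSettings, kernel, cancellationToken);

        foreach (var message in results)
        {
            // "Usage" is the key the OpenAI connector uses; adjust to whatever your client returns.
            if (message.Metadata is not null && message.Metadata.TryGetValue("Usage", out var usage))
            {
                _logger.LogInformation("Token usage: {Usage}", usage);
            }
        }

        return results;
    }

    public IAsyncEnumerable<StreamingChatMessageContent> GetStreamingChatMessageContentsAsync(
        ChatHistory chatHistory,
        PromptExecutionSettings? executionSettings = null,
        Kernel? kernel = null,
        CancellationToken cancellationToken = default)
        => _inner.GetStreamingChatMessageContentsAsync(chatHistory, executionSettings, kernel, cancellationToken);
}
```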

One more recommendation: before writing to Kusto, try outputting your logs to the console first. You should be able to see all the logs there, which will make it easier to understand what you should expect to see in Kusto.
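For example (a quick sanity check; assumes the Microsoft.Extensions.Logging.Console package is referenced):

```csharp
builder.Services.AddLogging(loggingBuilder =>
{
    loggingBuilder.SetMinimumLevel(LogLevel.Information);
    loggingBuilder.AddConsole(); // confirm the token usage entries appear locally before wiring up Kusto
});
```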

I'm going to close this issue for now, but please let me know in case you have further questions. Thanks!