Open FenkoXD opened 3 weeks ago
Thank you for your feedback. Tagging and routing to the team member best able to assist.
@FenkoXD : If I understand correctly, you're saying that the internal task cancellation that is a normal and expected part of stopping the processor - and which does not surface to callers - is being reported incorrectly by AppInsights. Given that Service Bus has no insight into AppInsights integration nor influence over it, I'm going to transfer over to the Monitor folks for triage.
Thanks for the feedback! We are routing this to the appropriate team for follow-up. cc @cijothomas @rajkumar-rangaraj @reyang @TimothyMothra @vishweshbankwar.
Hi @FenkoXD, it looks like you're mixing different telemetry SDKs.
This is the classic Application Insights SDK; it is NOT OpenTelemetry. According to the config you shared, this is what's collecting your Dependencies:
`builder.Services.AddApplicationInsightsTelemetry();`
and
`builder.Services.AddSingleton<ITelemetryInitializer, ApplicationInsightsTelemetryInitializer>();`
My team owns this. This is our newer SDK, which will eventually replace the classic Application Insights SDK.
```
.WithMetrics(
    meterProviderBuilder => meterProviderBuilder
        .AddAzureMonitorMetricExporter(
```
I don't know what this is. It's not something my team owns and I can't comment.
`builder.Services.AddApplicationInsightsKubernetesEnricher();`
I recommend you choose either Application Insights SDK or OpenTelemetry Azure Monitor Exporter, but not both.
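As an illustration of the OpenTelemetry-only path, here is a minimal sketch using the Azure Monitor OpenTelemetry distro (`Azure.Monitor.OpenTelemetry.AspNetCore`); the connection string is a placeholder:

```csharp
// Remove the classic SDK registrations, e.g.:
//   builder.Services.AddApplicationInsightsTelemetry();
//   builder.Services.AddSingleton<ITelemetryInitializer, ApplicationInsightsTelemetryInitializer>();
// and rely only on the OpenTelemetry-based pipeline, for example via the distro:
builder.Services
    .AddOpenTelemetry()
    .UseAzureMonitor(o => o.ConnectionString = "<your-connection-string>");
```

This keeps a single telemetry pipeline, so Dependencies are collected exactly once.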
@TimothyMothra, thanks for your response.
We will try using the two exporters separately and check the dependencies.
Can we expect that Application Insights and OpenTelemetry, used separately, will gather the same amount of dependency telemetry? I mean, are there any metrics which could be missed in one SDK or the other?
Can we expect that Application Insights and OpenTelemetry, used separately, will gather the same amount of dependency telemetry?
Yes
I mean, are there any metrics which could be missed in one SDK or the other?
No metrics should be missed.
Please review our migration guide for notes on migrating from Application Insights to OpenTelemetry.
If you have any problems with migrating, please file an Azure Support Request. You can do so by going online to the Azure portal and submitting a support request. Access to subscription management and billing support is included with your Microsoft Azure subscription, and technical support is provided through one of the Azure Support Plans. For step-by-step guidance for the Azure portal, see How to create an Azure support request.
@TimothyMothra
We've tried using only the OpenTelemetry metrics, and now the Service Bus failures have disappeared. That's great.
But we noticed another issue. It's not related to this topic, but maybe you can suggest whether I should report it as a feature request? Using OpenTelemetry logging, I saw we can configure scoped logging:
```
.AddOpenTelemetry(loggingOptions =>
{
    loggingOptions.IncludeFormattedMessage = true;
    loggingOptions.IncludeScopes = true;
    loggingOptions.ParseStateValues = true;
    // . . .
})
```
If I understood it correctly, in the case of `Logger<MyClass>.Information("Some comment...")`, I should see `MyClass` somewhere in the log entry, but it is absent. Is it possible to add source context to the logs? Example log entry:
Here is an example of how this is solved in Serilog logging. Is it possible to include SourceContext information in OpenTelemetry logs?
If I understood it correctly, in the case of `Logger<MyClass>.Information("Some comment...")`, I should see `MyClass` somewhere in the log entry, but it is absent. Is it possible to add source context to logs?
Yes! ILogger calls this the "CategoryName". Support for this was added in our latest beta: https://www.nuget.org/packages/Azure.Monitor.OpenTelemetry.Exporter/1.4.0-beta.1 We're targeting November for our next stable release.
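For reference, the category name comes from the generic type parameter of `ILogger<T>`: it is the fully qualified name of `T`. A minimal sketch (class and namespace names here are made up for illustration):

```csharp
// Assumes Microsoft.Extensions.Logging with the OpenTelemetry logging
// provider configured as in the config shared above.
namespace MyApp.Services;

public class MyClass
{
    private readonly ILogger<MyClass> _logger;

    public MyClass(ILogger<MyClass> logger) => _logger = logger;

    public void DoWork()
    {
        // The log record's CategoryName is "MyApp.Services.MyClass".
        // With Azure.Monitor.OpenTelemetry.Exporter 1.4.0-beta.1 or later,
        // that category is exported along with the log entry.
        _logger.LogInformation("Some comment...");
    }
}
```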
@TimothyMothra thank you for clarifications. I think I can close the issue.
Sad to report, but after enabling "Azure.Experimental.EnableActivitySource" to track Azure activities, we started receiving the same dependency failure as in the first post. Here is the telemetry config:
```
AppContext.SetSwitch("Azure.Experimental.EnableActivitySource", true);

builder.Services
    .AddOpenTelemetry()
    .ConfigureResource(
        resourceBuilder => resourceBuilder
            .AddService(serviceName: Build.Name, serviceInstanceId: Environment.MachineName))
    .WithTracing(tracerProviderBuilder =>
    {
        tracerProviderBuilder
            // Automatically instrument ASP.NET Core requests.
            .AddAspNetCoreInstrumentation()
            // Add a custom source for tracing.
            .AddSource(Build.Name)
            // Enable all Azure.* sources, but filter out HTTP client activities
            // that would duplicate the Azure activities.
            .AddSource("Azure.*")
            .AddHttpClientInstrumentation(o =>
            {
                o.FilterHttpRequestMessage = _ => Activity.Current?.Parent?.Source?.Name != "Azure.Core.Http";
            })
            .AddAzureMonitorTraceExporter(o =>
            {
                o.ConnectionString = configuration.ApplicationInsights.ConnectionString;
            });
    })
    .WithMetrics(
        meterProviderBuilder => meterProviderBuilder
            .AddAzureMonitorMetricExporter(
                x => x.ConnectionString = configuration.ApplicationInsights.ConnectionString)
            .AddProcessInstrumentation()
            .AddRuntimeInstrumentation()
            .AddMeter(
                "Microsoft.AspNetCore.Server.Kestrel",
                Build.Name));
```
@FenkoXD, can you share a small sample app? You can either attach a zip or create a new repo. I'll need to debug this and discuss with the other team.
@TimothyMothra, yes, I've created a minimal app for reproduction: repo. I added an ActivityFilteringProcessor; it doesn't filter anything, but I left a comment in that class explaining how to catch the problematic Activity. Just change appsettings.json to point at any Service Bus and App Insights connection string, then run the app and trigger the Test GET method in Swagger.
I've filtered out the problematic activity, but I'm not sure whether it's a good workaround :)
```
public class ServiceBusTaskCancelFilteringProcessor : BaseProcessor<Activity>
{
    public override void OnEnd(Activity activity)
    {
        // Suppress export of the activity by clearing its Recorded flag.
        if (activity.Source.Name.Contains("Azure.Messaging.ServiceBus.ServiceBusReceiver"))
        {
            if (activity.TagObjects.Any(tag => tag is { Key: "error.type", Value: string and "System.Threading.Tasks.TaskCanceledException" }))
            {
                activity.ActivityTraceFlags &= ~ActivityTraceFlags.Recorded;
            }
        }
    }
}
```
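For completeness, here is a sketch of how such a custom processor would be registered in the tracing pipeline (the builder shape follows the config shared earlier in this thread; the exporter call is optional and shown only for context):

```csharp
builder.Services
    .AddOpenTelemetry()
    .WithTracing(tracerProviderBuilder =>
    {
        tracerProviderBuilder
            // Register the custom processor so it runs on every ended Activity
            // before the exporter sees it.
            .AddProcessor(new ServiceBusTaskCancelFilteringProcessor())
            .AddAzureMonitorTraceExporter(o =>
            {
                o.ConnectionString = configuration.ApplicationInsights.ConnectionString;
            });
    });
```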
The workaround looks fine. :) Pro tip: add a comment referencing this GitHub issue so you know why the workaround was necessary and when it can be removed.
Thanks for sharing the repro, please give me a few days to take a look. I might not get to this until Monday.
Library name and version
Azure.Messaging.ServiceBus 7.18.1
Describe the bug
Calling StopProcessingAsync on a ServiceBusProcessor is exported to Azure Application Insights as a dependency failure with the exception System.Threading.Tasks.TaskCanceledException: A task was canceled.
Expected behavior
Expected that await _processor.StopProcessingAsync() would finish the ongoing task and pause receiving messages from the topic without any failure being reported.
Actual behavior
Calling StopProcessingAsync on a ServiceBusProcessor is exported to Azure Application Insights as a dependency failure with the exception System.Threading.Tasks.TaskCanceledException: A task was canceled.
Reproduction Steps
```
services.AddAzureClients(x => { x.AddServiceBusClient(serviceBusConfig.ConnectionString); });

builder.Services
    .AddOpenTelemetry()
    .ConfigureResource(
        resourceBuilder => resourceBuilder
            .AddService(serviceName: Build.Name, serviceInstanceId: Environment.MachineName))
    .WithMetrics(
        meterProviderBuilder => meterProviderBuilder
            .AddAzureMonitorMetricExporter(
                x => x.ConnectionString = configuration.ApplicationInsights.ConnectionString)
            .AddProcessInstrumentation()
            .AddRuntimeInstrumentation()
            .AddMeter(
                "Microsoft.AspNetCore.Server.Kestrel",
                Build.Name));
```
```
public class SampleMessagesReceiver
{
    private readonly ServiceBusProcessor? _processor;

    public SampleMessagesReceiver(ServiceBusClient serviceBusClient)
    {
        _processor = serviceBusClient.CreateProcessor(
            _serviceBusConfig.ChangepointBaselineTopicName,
            _serviceBusConfig.ChangepointBaselineSubscription,
            new ServiceBusProcessorOptions());
    }

    public async Task PauseMessageProcessing(CancellationToken cancellationToken)
    {
        if (_processor == null)
        {
            return;
        }

        try
        {
            await _processor.StopProcessingAsync(cancellationToken);
        }
        catch (Exception e)
        {
            _logger.LogError(e, "Error while trying to pause message processing");
        }
    }
}
```
Kernel Version: 5.15.0-1057-azure
OS Image: Ubuntu 22.04.4 LTS
Operating System: linux
Architecture: amd64
Container Runtime Version: containerd://1.7.7-1
Kubelet Version: v1.28.3
Kube-Proxy Version: v1.28.3