@Eneuman, thanks for reporting this issue. It looks like this is something new. I will need to spend some time understanding the setup to see whether the current implementation will work:

> The instrumentation key is stored in AzureAppConfig and the application is using Workload identity to access it.

When there's an update or if I need more info, I'll post a reply here.
The config setup looks like this:

```csharp
public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .ConfigureAppConfiguration(config =>
        {
            config.AddJsonFile("config/appsettings.json", optional: true);
        })
        .ConfigureWebHostDefaults(webBuilder =>
            webBuilder.ConfigureAppConfiguration((hostingContext, config) =>
            {
                var settings = config.Build();
                config.AddAzureAppConfiguration(options =>
                {
                    options.Connect(new Uri(settings["AppConfig:Endpoint"]!), new DefaultAzureCredential());
                    options.ConfigureKeyVault(kv =>
                    {
                        kv.SetCredential(new DefaultAzureCredential());
                    });
                });
            })
            .UseStartup<Startup>());
```
Hi @Eneuman,
I read the call stack, and the error message is: "Could not get the stamp id. Aborting the upload process." Based on that, I think you have configured the iKey correctly for the Profiler; otherwise, it would be something like "Please make sure the instrumentation key is valid."

The part I have no clue about is why the stamp id is not returned properly. Let's troubleshoot that a bit.
One possible cause is authorization. Another possible cause is networking/firewall.
Could you please replace the iKey below, try to hit this endpoint, and see what the response is?

HTTP GET: `https://agent.azureserviceprofiler.net/api/profileragent/v4/stampid?iKey=<your-ikey>&machineName=troubleshooting`
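If it helps to script that check, here is a small Python sketch (this is illustrative, not part of the profiler; the endpoint path comes from the URL above, and the iKey placeholder is yours to fill in) that issues the same GET that wget would and reports the status and body:

```python
# Hedged sketch: issue the stamp id GET and report what comes back.
# Replace the iKey placeholder with your own instrumentation key.
import urllib.request

def probe_stamp_id(base_url: str, ikey: str, machine: str = "troubleshooting"):
    """GET the profiler stamp id endpoint; return (status, body)."""
    url = f"{base_url}/api/profileragent/v4/stampid?iKey={ikey}&machineName={machine}"
    with urllib.request.urlopen(url) as resp:
        return resp.status, resp.read().decode()

# Example call (requires network access and a real iKey):
# status, body = probe_stamp_id("https://agent.azureserviceprofiler.net", "<your-ikey>")
```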
It would also help if I could check the backend telemetry. Could you please send your instrumentation key to serviceprofilerhelp@microsoft.com, mention the URL of this issue, and I will look for the telemetry on the requests, if they reach our backend.

Thanks!
Hi!
Local authentication is enabled.
I think the wget was successful. This is the result:
```
Connecting to agent.azureserviceprofiler.net (20.100.7.5:443)
Connecting to westeurope.agent.azureserviceprofiler.net:443 (40.113.176.133:443)
saving to 'stampid?iKey=REDACTED&machineName=troubleshooting'
stampid?iKey=REDACTED 100% |************************************************************************| 24 0:00:00 ETA
'stampid?iKey=REDACTED&machineName=troubleshooting' saved
```
I have mailed you the iKey :)
Hi @Eneuman, I got your iKey, but I can't find any related telemetry.

We did have some heavy sampling (dropped telemetry) last week. Do you mind running your application again and trying to upload the profile a couple of times to see if that works for you? Then I will check the telemetry again.

At the same time, I will need to check something else; maybe the uploader has stalled somehow.
Hi,

I have now sent about 15 more GETs :)
Hey @Eneuman, now I see what is going on; let me explain. According to the telemetry:

The profiler client is supposed to follow the redirect (307) automatically, but it isn't. A potential cause is that a newer version of the Azure.Core package that we rely on recently introduced a breaking change that stops it from automatically following redirects. That is a known issue, and it could explain the behavior (and why wget works). What I am not sure about is what brings in the new version.
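To illustrate the diagnosis with a local mock (this is not the profiler's actual client code, just a self-contained Python sketch of the HTTP behavior described above):

```python
# Local mock showing why disabling automatic 307 handling breaks the stamp id
# call while wget still works: the global endpoint answers with a 307 pointing
# at the regional endpoint, and only a client that follows it gets the payload.
import http.server
import threading
import urllib.request
import urllib.error

class MockAgent(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith("/regional"):
            body = b"westeurope-stamp"
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            # Like agent.azureserviceprofiler.net redirecting to the
            # westeurope regional endpoint.
            self.send_response(307)
            self.send_header("Location", "/regional/stampid")
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo output quiet

server = http.server.HTTPServer(("127.0.0.1", 0), MockAgent)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"

# A wget-like client follows the 307 and gets the stamp id.
body_when_following = urllib.request.urlopen(f"{base}/stampid").read().decode()

# A client with auto-redirect disabled only ever sees the 307.
class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None  # refuse to follow any redirect

status_when_disabled = None
try:
    urllib.request.build_opener(NoRedirect).open(f"{base}/stampid")
except urllib.error.HTTPError as e:
    status_when_disabled = e.code

print(body_when_following)   # the regional payload
print(status_when_disabled)  # 307: the stamp id is never reached

server.shutdown()
```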
Do you mind sharing the project.assets.json from your startup project (under the obj folder, after a restore)? I can run an analysis there to see which package brings it in, and whether there's a potential fix.
In the meantime, this could be your workaround: add a setting to appsettings.json:
```json
{
  "ServiceProfiler": {
    "Endpoint": "https://westeurope.agent.azureserviceprofiler.net/"
  }
}
```
Or, in your App Configuration store, I believe the setting can be formatted as:

```
ServiceProfiler:Endpoint = https://westeurope.agent.azureserviceprofiler.net/
```
Hi, I think you found the problem :) Adding the key to App Configuration solved the problem. Here is the file you requested.
Redacted
Hey @Eneuman,
Here is the dependency analysis: Azure.Core is resolved to 1.35.0 due to multiple dependencies:

The redirect issue I mentioned was introduced around 1.33.0, and I think that is why auto-redirect is disabled.
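For anyone who wants to run the same analysis on their own project.assets.json, a hedged Python sketch (the file layout assumed here is NuGet's standard `targets` -> framework -> `"Package/Version"` -> `dependencies` shape):

```python
# Hedged sketch: scan a project.assets.json (produced by a NuGet restore under
# obj/) and list which packages declare a dependency on a given package, e.g.
# Azure.Core, to see what pulls in the problematic version.
import json

def who_depends_on(assets_path: str, package: str) -> dict:
    """Map 'Dependent/Version' -> the version range it requests for `package`."""
    with open(assets_path) as f:
        assets = json.load(f)
    dependents = {}
    for framework_entries in assets.get("targets", {}).values():
        for name_version, entry in framework_entries.items():
            wanted = entry.get("dependencies", {}).get(package)
            if wanted is not None:
                dependents[name_version] = wanted
    return dependents

# Example:
# print(who_depends_on("obj/project.assets.json", "Azure.Core"))
```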
We have a fix for that in our codebase already. I'll get the beta out soon and let you know.
Hi @Eneuman, Please try out the new release and see if that works without the workaround: https://www.nuget.org/packages/Microsoft.ApplicationInsights.Profiler.AspNetCore/2.6.0-beta1
Thanks again for reporting the issue.
What's the ETA on the Microsoft.ApplicationInsights.Profiler.AspNetCore 2.6.0 release with this fix in place?
I am having the same problem and tried v2.6.0-beta2, but then I got another error, just like in this other issue:
```
[18:51:37 ERR] Trace upload failed. Exit code: 137
[18:51:37 ERR] Unexpected error happens on stopping service profiler.
Microsoft.ApplicationInsights.Profiler.Core.IPC.UnsupportedPayloadTypeException: Can't deserialize message over the named pipe.
   at Microsoft.ApplicationInsights.Profiler.Core.IPC.DuplexNamedPipeService.ReadAsync[T](TimeSpan timeout, CancellationToken cancellationToken)
   at Microsoft.ApplicationInsights.Profiler.Core.ServiceProfilerProvider.<>c__DisplayClass4_0.<<PostStopProcessAsync>b__0>d.MoveNext()
--- End of stack trace from previous location ---
   at Microsoft.ApplicationInsights.Profiler.Core.ServiceProfilerProvider.PostStopProcessAsync(PostStopOptions e, CancellationToken cancellationToken)
[18:51:37 ERR] Unexpected error happens on stopping service profiler tracing.
Microsoft.ApplicationInsights.Profiler.Core.IPC.UnsupportedPayloadTypeException: Can't deserialize message over the named pipe.
   at Microsoft.ApplicationInsights.Profiler.Core.IPC.DuplexNamedPipeService.ReadAsync[T](TimeSpan timeout, CancellationToken cancellationToken)
   at Microsoft.ApplicationInsights.Profiler.Core.ServiceProfilerProvider.<>c__DisplayClass4_0.<<PostStopProcessAsync>b__0>d.MoveNext()
--- End of stack trace from previous location ---
   at Microsoft.ApplicationInsights.Profiler.Core.ServiceProfilerProvider.PostStopProcessAsync(PostStopOptions e, CancellationToken cancellationToken)
   at Microsoft.ApplicationInsights.Profiler.Core.ServiceProfilerProvider.StopServiceProfilerAsync(IProfilerSource source, CancellationToken cancellationToken)
[18:51:37 ERR] Unexpected exception in the last cycle.
Microsoft.ApplicationInsights.Profiler.Core.IPC.UnsupportedPayloadTypeException: Can't deserialize message over the named pipe.
   at Microsoft.ApplicationInsights.Profiler.Core.IPC.DuplexNamedPipeService.ReadAsync[T](TimeSpan timeout, CancellationToken cancellationToken)
   at Microsoft.ApplicationInsights.Profiler.Core.ServiceProfilerProvider.<>c__DisplayClass4_0.<<PostStopProcessAsync>b__0>d.MoveNext()
--- End of stack trace from previous location ---
   at Microsoft.ApplicationInsights.Profiler.Core.ServiceProfilerProvider.PostStopProcessAsync(PostStopOptions e, CancellationToken cancellationToken)
   at Microsoft.ApplicationInsights.Profiler.Core.ServiceProfilerProvider.StopServiceProfilerAsync(IProfilerSource source, CancellationToken cancellationToken)
   at Microsoft.ApplicationInsights.Profiler.Core.Orchestration.OrchestratorEventPipe.StopProfilingAsync(SchedulingPolicy policy, CancellationToken cancellationToken)
   at Microsoft.ServiceProfiler.Orchestration.SchedulingPolicy.StartPolicyAsync(CancellationToken cancellationToken)
```
I was having the same issue, and the beta package fixed it for me.
Hi @KirkMunroSagent, the release is in the pipeline; expect it by early next week. @bitbound Thanks for the feedback. @rasert I would suggest trying 2.6.0 once it is released, and if the problem persists, feel free to file another bug for investigation.
I will post the package info once it is out there.
The stable package is released: https://www.nuget.org/packages/Microsoft.ApplicationInsights.Profiler.AspNetCore/2.6.0
Describe the bug

I'm running the profiler in a container in AKS. About 2 minutes after the container has started, I receive the following error in the logs:

Application Insights is working fine and is reporting the same exception, so I know the instrumentation key is working.
Desktop (please complete the following information):
The instrumentation key is stored in AzureAppConfig and the application is using Workload identity to access it.
I have been searching everywhere for an answer, and I just can't figure out why it's not working when AI is working fine.