microsoft / ApplicationInsights-dotnet

MIT License

ApplicationInsights 2.6.4: Exception: System.OutOfMemoryException in Azure WebApps #856

Closed Leftyx closed 4 years ago

Leftyx commented 6 years ago

Repro Steps

  1. Add nuget package Microsoft.ApplicationInsights
  2. Publish to Azure Web App

Actual Behavior

After upgrading to ApplicationInsights 2.6.4 we have started to notice these random errors in Azure WebApps. This wasn't happening before, with the previous SDK.

System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
   at Microsoft.ApplicationInsights.Extensibility.AutocollectedMetricsExtractor.EnsureItemNotSampled(ITelemetry item)
   at Microsoft.ApplicationInsights.Extensibility.AutocollectedMetricsExtractor.ExtractMetrics(ITelemetry fromItem)
   at Microsoft.ApplicationInsights.Extensibility.AutocollectedMetricsExtractor.Process(ITelemetry item)
   at Microsoft.ApplicationInsights.Extensibility.PerfCounterCollector.QuickPulse.QuickPulseTelemetryProcessor.Process(ITelemetry telemetry)
   at Microsoft.ApplicationInsights.TelemetryClient.Track(ITelemetry telemetry)
   at Microsoft.ApplicationInsights.TelemetryClient.TrackRequest(RequestTelemetry request)
   at Microsoft.ApplicationInsights.Web.RequestTrackingTelemetryModule.OnEndRequest(HttpContext context)
   at Microsoft.ApplicationInsights.Web.AspNetDiagnosticTelemetryModule.AspNetEventObserver.OnNext(KeyValuePair`2 value)
   at System.Diagnostics.DiagnosticListener.Write(String name, Object value)
   at System.Diagnostics.DiagnosticSource.StopActivity(Activity activity, Object args)
   at Microsoft.AspNet.TelemetryCorrelation.ActivityHelper.StopAspNetActivity(Activity activity, IDictionary contextItems)
   at Microsoft.AspNet.TelemetryCorrelation.TelemetryCorrelationHttpModule.Application_EndRequest(Object sender, EventArgs e)
   at System.Web.HttpApplication.SyncEventExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
   at System.Web.HttpApplication.<>c__DisplayClass285_0.<ExecuteStepImpl>b__0()
   at System.Web.HttpApplication.StepInvoker.Invoke(Action executionStep)
   at System.Web.HttpApplication.StepInvoker.<>c__DisplayClass4_0.<Invoke>b__0()
   at Microsoft.AspNet.TelemetryCorrelation.TelemetryCorrelationHttpModule.OnExecuteRequestStep(HttpContextBase context, Action step)
   at System.Web.HttpApplication.<>c__DisplayClass284_0.<OnExecuteRequestStep>b__0(Action nextStepAction)
   at System.Web.HttpApplication.StepInvoker.Invoke(Action executionStep)
   at System.Web.HttpApplication.ExecuteStepImpl(IExecutionStep step)
   at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)

Expected Behavior

No exceptions

Version Info

SDK Version: 2.6.4
.NET Version: 4.6.2 + ASP.NET MVC 5.2.6
How Application was onboarded with SDK (VisualStudio/StatusMonitor/Azure Extension): VisualStudio
OS: Windows
Hosting Info (IIS/Azure WebApps/etc.): Azure WebApps

Dmitry-Matveev commented 6 years ago

@Leftyx, thanks for reporting!

We do not seem to have a repro of this issue at the moment. Can you please collect a couple of dumps X minutes apart (where X is short enough to avoid the failure but long enough to show the memory uptick)?

A dump taken at the moment of the issue would be great, but I doubt you can collect one in the App Services environment without extra tooling.

Related references:

  - Manually collecting memory dumps on App Services
  - Remote debug Azure App Service

P.S. Downloading the memory dumps will count towards the outbound bandwidth of the App Service plan. Please ZIP the dumps to avoid increased charges.
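To illustrate why compressing first pays off, here is a minimal sketch (local scratch paths, example filenames; on App Service the dumps would live under the LogFiles folder). Memory dumps tend to compress very well, so zipping or gzipping before download cuts the transferred bytes substantially:

```shell
#!/bin/sh
# Sketch: compress a collected dump before downloading it.
# /tmp/dumps and w3wp.dmp are stand-ins for the real App Service paths.
mkdir -p /tmp/dumps
# Stand-in for a real w3wp.exe memory dump (mostly zero pages here;
# real dumps also contain large compressible regions).
dd if=/dev/zero of=/tmp/dumps/w3wp.dmp bs=1024 count=512 2>/dev/null
# Compress; real dumps can be zipped the same way before download.
gzip -c /tmp/dumps/w3wp.dmp > /tmp/dumps/w3wp.dmp.gz
ls -l /tmp/dumps/w3wp.dmp /tmp/dumps/w3wp.dmp.gz
```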

Leftyx commented 6 years ago

Thanks for your swift reply. I'm afraid I won't be able to collect dumps on Azure for the Web App, because those exceptions happen randomly and mostly in production, where we have high traffic. We've also noticed that it happens after a slot swap.

This is another failure after a recent swap:

System.OutOfMemoryException:
   at System.Web.Util.StringUtil.GetNullTerminatedByteArray (System.Web, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a)
   at System.Web.Hosting.IIS7WorkerRequest.SetUnknownResponseHeader (System.Web, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a)
   at System.Web.HttpHeaderCollection.SetHeader (System.Web, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a)
   at System.Web.HttpHeaderCollection.Set (System.Web, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a)
   at NWebsec.Helpers.HeaderResultHandler.HandleHeaderResult (NWebsec, Version=4.3.0.0, Culture=neutral, PublicKeyToken=3613da5f958908a1)
   at NWebsec.Helpers.ConfigurationHeaderSetter.SetCspHeaders (NWebsec, Version=4.3.0.0, Culture=neutral, PublicKeyToken=3613da5f958908a1)
   at NWebsec.Helpers.ConfigurationHeaderSetter.SetContentRelatedHeadersFromConfig (NWebsec, Version=4.3.0.0, Culture=neutral, PublicKeyToken=3613da5f958908a1)
   at NWebsec.Modules.HttpHeaderSecurityModule.AppPostMapRequestHandler (NWebsec, Version=4.3.0.0, Culture=neutral, PublicKeyToken=3613da5f958908a1)
   at System.Web.HttpApplication+SyncEventExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute (System.Web, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a)
   at System.Web.HttpApplication+<>c__DisplayClass285_0.<ExecuteStepImpl>b__0 (System.Web, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a)
   at System.Web.HttpApplication+StepInvoker.Invoke (System.Web, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a)
   at System.Web.HttpApplication+StepInvoker+<>c__DisplayClass4_0.<Invoke>b__0 (System.Web, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a)
   at Microsoft.AspNet.TelemetryCorrelation.TelemetryCorrelationHttpModule.OnExecuteRequestStep (Microsoft.AspNet.TelemetryCorrelation, Version=1.0.3.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35)
   at System.Web.HttpApplication+<>c__DisplayClass284_0.<OnExecuteRequestStep>b__0 (System.Web, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a)
   at System.Web.HttpApplication+StepInvoker.Invoke (System.Web, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a)
   at System.Web.HttpApplication.ExecuteStepImpl (System.Web, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a)
   at System.Web.HttpApplication.ExecuteStep (System.Web, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a)

It might be related to another issue I've reported recently which still happens every 10 minutes or so.

Dmitry-Matveev commented 6 years ago

I assigned the upcoming milestone so we can try to reproduce the issue on our side. If memory was piling up due to the related issue you mentioned, then it's possible the latest 2.7 might help. It does not look like #855 is the reason for this particular OutOfMemoryException, though...
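In the meantime, if you want to rule the metrics extractor out as a contributor, one thing to try is removing it from the telemetry processor chain. This is only a way to isolate the component at the top of your first stack trace, not a confirmed fix. A sketch of the relevant ApplicationInsights.config section, assuming the default 2.x .NET Framework SDK layout:

```xml
<!-- ApplicationInsights.config, TelemetryProcessors section (sketch; assumes default 2.x layout) -->
<TelemetryProcessors>
  <!-- Comment out or delete this entry to take AutocollectedMetricsExtractor
       out of the chain. Note: auto-collected request/dependency metrics will
       no longer be pre-aggregated while it is disabled. -->
  <!--
  <Add Type="Microsoft.ApplicationInsights.Extensibility.AutocollectedMetricsExtractor, Microsoft.ApplicationInsights"/>
  -->
  <!-- leave the other processors (sampling, QuickPulse, etc.) in place -->
</TelemetryProcessors>
```

If the OutOfMemoryException still occurs with the extractor disabled, that would point at general memory exhaustion in the process rather than this component, which would also be consistent with your second stack trace failing in System.Web header handling.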

What is the typical load in RPS on the application with the issue, and approximately how many outgoing calls does it make per request?

cijothomas commented 4 years ago

Closing this old issue, as there is no repro to investigate.

sonalinoolkar commented 3 years ago

I am facing a similar OutOfMemoryException after upgrading to 2.16.0.