microsoft / ApplicationInsights-dotnet

MIT License
565 stars 287 forks

DependencyMetricsExtractor.ExtractMetrics threw exceptions #549

Closed northtyphoon closed 6 years ago

northtyphoon commented 7 years ago

We observed the behavior in some ETL traces we collected. I attached one sample call stack.

"ExceptionMessage Cannot execute ExtractMetrics because this metrics extractor has not been initialized (no metrics manager)."

 + Thread (9800) CPU=72ms (.NET ThreadPool)
  + ntdll!?
   + kernel32!?
    + clr!Thread::intermediateThreadProc
     + clr!ThreadpoolMgr::ExecuteWorkRequest
      + clr!UnManagedPerAppDomainTPCount::DispatchWorkItem
       + picohelper!?
        + webengine4!?
         + ?!?
          + clr!UM2MDoADCallBack
           + clr!??Thread::DoADCallBack
            + clr!UM2MThunk_Wrapper
             + ?!?
              + system.web!PipelineRuntime.ProcessRequestNotification
               + system.web!PipelineRuntime.ProcessRequestNotificationHelper
                + system.web!dynamicClass.IL_STUB_PInvoke(int,value class System.Web.RequestNotificationStatus&)
                 + webengine4!?
                  + iiscore!?
                   + webengine4!?
                    + ?!?
                     + system.web!PipelineRuntime.ProcessRequestNotification
                      + system.web!PipelineRuntime.ProcessRequestNotificationHelper
                       + system.web!HttpRuntime.ProcessRequestNotificationPrivate
                        + system.web!HttpApplication.BeginProcessRequestNotification
                         + system.web!System.Web.HttpApplication+PipelineStepManager.ResumeSteps(class System.Exception)
                          + system.web!HttpApplication.ExecuteStep
                           + system.web!System.Web.HttpApplication+CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
                            + system.web.mvc!IHttpAsyncHandler.EndProcessRequest
                             + system.web.mvc!MvcHandler.EndProcessRequest
                              + system.web.mvc!System.Web.Mvc.Async.AsyncResultWrapper+<>c__DisplayClass4.<MakeVoidDelegate>b__3(class System.IAsyncResult)
                               + system.web.mvc!System.Web.Mvc.MvcHandler+<>c__DisplayClass8.<BeginProcessRequest>b__3(class System.IAsyncResult)
                                + system.web.mvc!IAsyncController.EndExecute
                                 + system.web.mvc!Controller.EndExecute
                                  + system.web.mvc!System.Web.Mvc.Async.AsyncResultWrapper+<>c__DisplayClass4.<MakeVoidDelegate>b__3(class System.IAsyncResult)
                                   + system.web.mvc!Controller.EndExecuteCore
                                    + system.web.mvc!System.Web.Mvc.Async.AsyncResultWrapper+<>c__DisplayClass4.<MakeVoidDelegate>b__3(class System.IAsyncResult)
                                     + system.web.mvc!System.Web.Mvc.Controller+<>c__DisplayClass1d.<BeginExecuteCore>b__18(class System.IAsyncResult)
                                      + system.web.mvc!AsyncControllerActionInvoker.EndInvokeAction
                                       + system.web.mvc!System.Web.Mvc.Async.AsyncResultWrapper+WrappedAsyncResult`1[System.Boolean].End()
                                        + system.web.mvc!System.Web.Mvc.Async.AsyncControllerActionInvoker+<>c__DisplayClass25.<BeginInvokeAction>b__22(class System.IAsyncResult)
                                         + system.web.mvc!System.Web.Mvc.Async.AsyncControllerActionInvoker+<>c__DisplayClass25+<>c__DisplayClass2a.<BeginInvokeAction>b__20()
                                          + system.web.mvc!AsyncControllerActionInvoker.EndInvokeActionMethodWithFilters
                                           + system.web.mvc!System.Web.Mvc.Async.AsyncControllerActionInvoker+<>c__DisplayClass37.<BeginInvokeActionMethodWithFilters>b__36(class System.IAsyncResult)
                                            + system.web.mvc!System.Web.Mvc.Async.AsyncControllerActionInvoker+<>c__DisplayClass4f.<InvokeActionMethodFilterAsynchronously>b__49()
                                             + system.web.mvc!System.Web.Mvc.Async.AsyncControllerActionInvoker+<>c__DisplayClass37+<>c__DisplayClass39.<BeginInvokeActionMethodWithFilters>b__33()
                                              + system.web.mvc!AsyncControllerActionInvoker.EndInvokeActionMethod
                                               + system.web.mvc!System.Web.Mvc.Async.AsyncResultWrapper+<>c__DisplayClass8`1[System.__Canon].<BeginSynchronous>b__7(class System.IAsyncResult)
                                                + system.web.mvc!System.Web.Mvc.Async.AsyncControllerActionInvoker+<>c__DisplayClass42.<BeginInvokeSynchronousActionMethod>b__41()
                                                 + system.web.mvc!ControllerActionInvoker.InvokeActionMethod
                                                  + anonymously hosted dynamicmethods assembly!dynamicClass.lambda_method(pMT: 19E0A4A4,pMT: 19DD4388,class System.Object[])
                                                   + fabrikamfiber.web!HomeController.Index
                                                    + fabrikamfiber.web!HomeController.InitializeMetaData
                                                     + fabrikamfiber.web!FabrikamFiberAzureStorage.GetStorageBlobData
                                                      + microsoft.windowsazure.storage!CloudBlobContainer.CreateIfNotExists
                                                       + microsoft.windowsazure.storage!CloudBlobContainer.CreateIfNotExists
                                                        + microsoft.windowsazure.storage!CloudBlobContainer.Create
                                                         + microsoft.windowsazure.storage!Executor.ExecuteSync
                                                          + system!HttpWebRequest.GetResponse
                                                           + mscorlib!DebuggerHiddenAttribute.ApplicationInsights_OnException
                                                            + microsoft.ai.dependencycollector!ProfilerHttpProcessing.OnExceptionForGetResponse
                                                             + microsoft.ai.dependencycollector!HttpProcessing.OnEnd
                                                              + microsoft.ai.dependencycollector!ClientServerDependencyTracker.EndTracking
                                                               + microsoft.applicationinsights!TelemetryClient.Track
                                                                + microsoft.ai.perfcountercollector!QuickPulseTelemetryProcessor.Process
                                                                 + microsoft.applicationinsights.exceptionexemplification!ExceptionExemplificationTelemetryProcessor.Process
                                                                  + microsoft.applicationinsights!AutocollectedMetricsExtractor.Process
                                                                   + microsoft.applicationinsights!AutocollectedMetricsExtractor.ExtractMetrics
                                                                    + microsoft.applicationinsights!DependencyMetricsExtractor.ExtractMetrics
                                                                     + clr!IL_Throw
                                                                      + clr!RaiseTheExceptionInternalOnly
                                                                       + ntdll!?
                                                                        + clr!COMPlusNestedExceptionHandler
                                                                         + clr!COMPlusFrameHandler
                                                                          + clr!CPFH_FirstPassHandler
                                                                           + clr!CPFH_RealFirstPassHandler
                                                                            + clr!Thread::StackWalkFrames
                                                                             + clr!Thread::StackWalkFramesEx
                                                                              + clr!COMPlusThrowCallback
                                                                               + clr!??ETW::ExceptionLog::ExceptionThrown
                                                                                + clr!CoTemplate_zzpqhh
                                                                                 + clr!EtwCallout
                                                                                  + EventData ExceptionType System.InvalidOperationException
                                                                                   + EventData ExceptionMessage Cannot execute ExtractMetrics because this metrics extractor has not been initialized (no metrics manager).
                                                                                    + Event Microsoft-Windows-DotNETRuntime/Exception/Start
SergeyKanzhelev commented 7 years ago

@northtyphoon did you by chance re-initialize the processor chain from code to add ExceptionExemplificationTelemetryProcessor?

CC: @macrogreg

northtyphoon commented 7 years ago

@pharring can you answer the question?

hallatore commented 7 years ago

I see the same issue with the default configuration. I noticed the errors in the Live Metrics Stream, but I can't find them in analytics.

(screenshot: errors shown in the Live Metrics Stream)

SergeyKanzhelev commented 7 years ago

CC @macrogreg again to follow up

@hallatore do you do any custom telemetry processor chain building? Or do you use ApplicationInsights.config?

macrogreg commented 7 years ago

@hallatore @northtyphoon @pharring This behavior is expected if you have done some custom initialization of the Telemetry Pipeline and something went wrong with that process. Could you please share details on whether and what specifically you might have done in that regard?

There are many things that could have gone wrong there, and we need more information to be specific. One thing (of many) to consider: if you created the Telemetry Pipeline programmatically, you must not forget to do something similar to the following for each Telemetry Processor:

TelemetryConfiguration telemetryConfig = /* the Pipeline config that you built in some way */;

TelemetryProcessorChain processors = telemetryConfig.TelemetryProcessorChain;
foreach (ITelemetryProcessor processor in processors.TelemetryProcessors)
{
    // Processors that also implement ITelemetryModule must be initialized explicitly.
    ITelemetryModule m = processor as ITelemetryModule;
    if (m != null)
    {
        m.Initialize(telemetryConfig);
    }
}
hallatore commented 7 years ago

I have a custom TelemetryProcessor, but it's only appended in the ApplicationInsights.config after the MetricsProcessor. Everything else is default from the config.

pharring commented 7 years ago

In the early prototypes of Snapshot Collector (which was back then called Exception Exemplification), we used code (as opposed to metadata in ApplicationInsights.config) to add the processor to the TelemetryProcessorChain via the TelemetryProcessorChainBuilder. At that time, the processor didn't implement ITelemetryModule and we certainly did not have any code like the snippet @macrogreg posted.

It seems onerous to expect callers to re-initialize modules like that. Shouldn't that happen inside .Build() on the TelemetryProcessorChainBuilder? Furthermore, shouldn't .Build() be smart enough not to destroy and re-create already-initialized processors?
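Until the SDK handles this automatically, the caller-side burden could be hidden behind a small extension method. A minimal sketch — `BuildAndInitialize` is a hypothetical helper name, not an SDK API, and it assumes the built processors are reachable via `config.TelemetryProcessors`:

```csharp
using Microsoft.ApplicationInsights.Extensibility;
using Microsoft.ApplicationInsights.Extensibility.Implementation;

public static class TelemetryProcessorChainBuilderExtensions
{
    // Hypothetical helper: build the chain, then initialize every processor
    // that also implements ITelemetryModule, so extractors like
    // DependencyMetricsExtractor get their metrics manager.
    public static void BuildAndInitialize(
        this TelemetryProcessorChainBuilder builder, TelemetryConfiguration config)
    {
        builder.Build();
        foreach (var processor in config.TelemetryProcessors)
        {
            (processor as ITelemetryModule)?.Initialize(config);
        }
    }
}
```

Callers would then invoke `.BuildAndInitialize(TelemetryConfiguration.Active)` instead of `.Build()`.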

macrogreg commented 7 years ago

@pharring - This is a good point; we should look into this possibility more closely. However, as an immediate workaround in case you do indeed build your chain programmatically: does the above recommendation help?

macrogreg commented 7 years ago

@hallatore - In that case there may be something wrong with the general module initialization logic. @SergeyKanzhelev or @Dmitry-Matveev - do you have any recommendations for a workaround?

hallatore commented 7 years ago

We are initializing ai from code with the following.

TelemetryConfiguration.Active.TelemetryProcessorChainBuilder
    .Use((tp) => new IgnorePostDeployPrimingTelemetryProcessor(tp))
    .Build();

What is the correct code change to fix this?

macrogreg commented 7 years ago

Hi @hallatore, please take a look at my post from 19/May. In your case you can set TelemetryConfiguration telemetryConfig = TelemetryConfiguration.Active.

Please try running my sample after you invoke the .Build() method. This should fix the problem, but please let me know if it does not. :)

hallatore commented 7 years ago

Does this look right, @macrogreg?

TelemetryConfiguration.Active.TelemetryProcessorChainBuilder
    .Use((tp) => new IgnorePostDeployPrimingTelemetryProcessor(tp))
    .Build();

foreach (var processor in TelemetryConfiguration.Active.TelemetryProcessors)
{
    var telemetryModule = processor as ITelemetryModule;
    if (telemetryModule != null)
    {
        telemetryModule.Initialize(TelemetryConfiguration.Active);
    }
}
hallatore commented 7 years ago

The code above fixed it.

But it feels like this should be handled automatically.

cijothomas commented 7 years ago

@macrogreg will be fixing this in 2.5.0-beta1

macrogreg commented 7 years ago

@cijothomas - in regard to the comment above - what we will be fixing in 2.5-betaX is that the diagnostic info is logged without the need for a first-chance exception. However, what I think the folks on this thread are asking for is that we improve the programmatic building of the pipeline so that the telemetryModule.Initialize(..) calls described earlier are not necessary. While I am in full support of such an upgrade, it is not a planned work item for 2.5-betaX from my perspective. The Core SDK team needs to consider how it fits into the team's priorities.

Leonardo-Ferreira commented 7 years ago

I just updated from 2.3 to 2.4 and this message started popping up all over my dev env... other than this message, everything seems to be working fine... will this message also appear in a prod env, or only while DeveloperMode is true?

andreujuanc commented 7 years ago

I'm having the same issue:

    AI (Internal): [msg=Log Error];[msg=Error in RequestMetricsExtractor: System.InvalidOperationException: Cannot execute ExtractMetrics because this metrics extractor has not been initialized (no metrics manager).
       at Microsoft.ApplicationInsights.Extensibility.Implementation.Metrics.DependencyMetricsExtractor.ExtractMetrics(ITelemetry fromItem, Boolean& isItemProcessed)
       at Microsoft.ApplicationInsights.Extensibility.AutocollectedMetricsExtractor.ExtractMetrics(ITelemetry fromItem)]

No idea what it can be. My references are up to date. Suggestions?

Leonardo-Ferreira commented 7 years ago

@andreujuanc I updated to 2.4.1 and it's all gone.

vmachacek commented 7 years ago

There is a serious problem with my configuration after upgrading to version 2.4: not only did AI (Internal) errors suddenly show up in App Insights, but image uploading also stopped working (I'm using Umbraco CMS with Azure Blob storage). It took me quite some time to figure out; I traced it back to the update of the App Insights references.

version 2.2 ApplicationInsights config - https://pastebin.com/raw/ngQV7pR9
version 2.4 ApplicationInsights config - https://pastebin.com/raw/Y4YB3Mnw

I also use this configuration during startup:

        Microsoft.ApplicationInsights.Extensibility.TelemetryConfiguration.Active.InstrumentationKey =
            ConfigurationManager.AppSettings["InstrumentationKey"];
        Microsoft.ApplicationInsights.Extensibility.TelemetryConfiguration.Active.TelemetryInitializers.Add(
            new AppInsightsInitializer());

        var builder = Microsoft.ApplicationInsights.Extensibility.TelemetryConfiguration.Active.TelemetryProcessorChainBuilder;

        builder.Use(next => new NoSqlDependencies(next));
        builder.Use(next => new NoFastDependencies(next));

        builder.Build();

This is to eliminate fast dependencies and also SQL dependencies, because of the monthly data cap in the Azure free tier.
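Applying the workaround from earlier in this thread to the startup code above means appending the module-initialization loop after builder.Build(). A hedged sketch (it assumes the custom processors NoSqlDependencies and NoFastDependencies exist as in the snippet above, and that only the built-in metrics extractor actually needs the call):

```csharp
using System.Linq;
using Microsoft.ApplicationInsights.Extensibility;

var config = TelemetryConfiguration.Active;
var builder = config.TelemetryProcessorChainBuilder;

builder.Use(next => new NoSqlDependencies(next));
builder.Use(next => new NoFastDependencies(next));
builder.Build();

// Workaround: Build() does not call Initialize on processors that are also
// modules (e.g. the autocollected-metrics extractor), so do it explicitly.
foreach (var module in config.TelemetryProcessors.OfType<ITelemetryModule>())
{
    module.Initialize(config);
}
```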

Leonardo-Ferreira commented 7 years ago

@vmachacek happened to me too... I figured out that the update (when done via the integrated Application Insights experience in VS) sets your telemetry key in the shared layout.cshtml, web.config, and ApplicationInsights.config... you need to reset all of those yourself... after resetting, it all went back to normal.

ohadschn commented 7 years ago

The code snippet suggested by @macrogreg resolved the issue for me as well. Here's a slightly more elegant version:

public static void InitializeProcessorModules()
{
    var telemetryConfig = TelemetryConfiguration.Active;
    foreach (var module in telemetryConfig.TelemetryProcessors.OfType<ITelemetryModule>())
    {
        module.Initialize(telemetryConfig);
    }
}
TimothyMothra commented 6 years ago

@SergeyKanzhelev, can you make the architecture decision: Should we attempt to auto-initialize all Processors as Modules? Or vice versa?

SergeyKanzhelev commented 6 years ago

Yes, we should re-initialize all processors on chain re-build.