aspnet / IISIntegration

[Archived] ASP.NET Core IIS integration. Project has moved to https://github.com/aspnet/AspNetCore
Apache License 2.0

ASP.NET Core 1.1 app suddenly serves 500 errors #365

Closed razzemans closed 7 years ago

razzemans commented 7 years ago

We have a few .NET Core 1.1 applications running in production on Azure (web app), usually on multiple instances. Quite frequently (depending on load, anywhere from once a day to once every two weeks), an instance will suddenly start serving 500 errors exclusively, failing rather badly with a simple black-on-white page:

The specified CGI application encountered an error and the server terminated the process.

It won't even show a custom 500 error page. It fails hard. The only solution we have now is extensive monitoring and restarting the affected instance (which resolves the issue). Note that really only one instance is affected; the other instances will continue to serve requests without any issue. The issue sometimes also happens when restarting our app: during or right after startup, it will immediately start serving errors.

I'm having quite a bit of trouble debugging this issue. What I have tried:

When the issue happens, the network bytes written will see a spike:

[image: cgi_graphs_anon]

I have an example failed-request log and event log to share, but not publicly :) Any help is appreciated.

Tratcher commented 7 years ago
  1. Check the event logs, though they likely say the process crashed.
  2. Enable console logging in your app, then enable stdout logging in AspNetCoreModule.
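For reference, in ASP.NET Core 1.x console logging is typically wired up with loggerFactory.AddConsole() in Startup.Configure, and the ANCM stdout logging mentioned above is a web.config switch on the aspNetCore handler. A minimal sketch, where the process path, assembly name, and log location are placeholders:

<system.webServer>
  <handlers>
    <add name="aspNetCore" path="*" verb="*" modules="AspNetCoreModule" resourceType="Unspecified" />
  </handlers>
  <!-- stdoutLogEnabled / stdoutLogFile turn on the per-process stdout log -->
  <aspNetCore processPath="dotnet" arguments=".\MyApp.dll"
              stdoutLogEnabled="true" stdoutLogFile=".\logs\stdout" />
</system.webServer>

On an Azure Web App, pointing stdoutLogFile somewhere under D:\home\LogFiles typically makes the output easy to retrieve through the Kudu console.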
razzemans commented 7 years ago
  1. The eventlog just shows meaningless data to me, like this (not sure if this is during the actual issue)
<Event>
<System>
<Provider Name="W3SVC-WP"/>
<EventID>2289</EventID>
<Level>2</Level>
<Task>0</Task>
<Keywords>Keywords</Keywords>
<TimeCreated SystemTime="2017-04-28T05:36:41Z"/>
<EventRecordID>121618296</EventRecordID>
<Channel>Application</Channel>
<Computer>RD00155D589595</Computer>
<Security/>
</System>
<EventData>
<Data>30</Data>
<Data>5</Data>
<Data>\\?\D:\home\LogFiles\W3SVC268986724\</Data>
<Binary>02000780</Binary>
</EventData>
</Event>
  2. Yes, I figured I have to enable that. I will during the next release, which is most likely Monday. Hope to get some more info then.
amirjalali1 commented 7 years ago

I have faced the same problem. I have logged everything, and I mean everything, and couldn't find anything.

razzemans commented 7 years ago

I am still working with Azure support on this issue. They have identified that we use too many outbound connections, which is quite strange as we are using components correctly (e.g. a singleton HttpClient, injecting DbContext as scoped, just as described in the docs everywhere). We also never had this issue in .NET 4.5 projects.

Even with too many outbound connections, I think it shouldn't fail with a CGI error; it should at least handle the request gracefully by serving a 500 error page.
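For context, a rough sketch of what the registrations described above usually look like in Startup.ConfigureServices; MyDbContext and the "Default" connection string name are placeholders:

public void ConfigureServices(IServiceCollection services)
{
    // A single shared HttpClient for the lifetime of the app, so sockets are reused.
    services.AddSingleton(new HttpClient());

    // AddDbContext registers the context with a scoped lifetime by default,
    // i.e. one instance per request.
    services.AddDbContext<MyDbContext>(options =>
        options.UseSqlServer(Configuration.GetConnectionString("Default")));

    services.AddMvc();
}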

Lutando commented 7 years ago

I am currently experiencing the same issue using .NET Core App 1.1. Our ASP.NET Core application sounds similar to @razzemans': we have a singleton CosmosDB client that serves all our outbound connections for queries.

I have looked at https://github.com/aspnet/IISIntegration/issues/311, https://github.com/aspnet/IISIntegration/issues/269, and https://github.com/aspnet/IISIntegration/issues/245#issuecomment-242541999 but I am still running into these issues

[image]

If you look at the image above, on May 9th we switched this app from the full framework to .NET Core, and we have been running into this issue ever since.

We also suspect it is a threading issue, because it happens under load and the App Service complains about the thread count being too high:

[image]

Event logs haven't pointed to anything specific, and neither have the application logs. It is pretty odd...

muratg commented 7 years ago

Assigning to @Tratcher for investigation.

Tratcher commented 7 years ago

Having a high thread count implies a resource deadlock. The best way to investigate is with a process dump so we can see where those threads are stuck. We recently investigated an issue like this in https://github.com/aspnet/DataProtection/issues/233.
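For anyone less familiar with that workflow, looking for stuck threads in a dump of a .NET Core process usually comes down to a handful of SOS commands in WinDbg (a rough sketch; exact steps vary by runtime version):

.loadby sos coreclr    (load SOS from the same folder as the loaded coreclr)
!threads               (list the managed threads)
~*e !clrstack          (dump the managed call stack of every thread)
!syncblk               (show monitors/locks and the threads that own them)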

razzemans commented 7 years ago

I will try to get a dump next time I run into this issue. I'm quite sure I can't use WinDbg on Azure, but I hope the default Azure memory dump will be enough.

amirjalali1 commented 7 years ago

I got several dump files over a week. It seems related to partial views, view components, and IDistributedCache. In my case, threads are waiting in

System.Threading.ManualResetEventSlim.Wait

I'm still not sure, because in release mode I can't access local arguments and variables and I'm not a pro at WinDbg, but I'm still examining the problem and doing further analysis.

Lutando commented 7 years ago

Just an update from my side. We had rogue threading issues that were fixed with some subtle changes to our production code.

It required some memory leak analysis to find out. While I'm sure this problem still persists for others in this issue thread, mine was caused by application code and not by its dependency libraries.

Tratcher commented 7 years ago

@1amirjalai can you share the call stack?

amirjalali1 commented 7 years ago

Here is the !clrstack output. There is a distributed-cache tag helper which wraps a partial view, which wraps a view component, which makes an async query using Dapper.

ntdll!NtWaitForMultipleObjects+14    
KERNELBASE!WaitForMultipleObjectsEx+ef    
coreclr!Thread::DoAppropriateWaitWorker+fe    
coreclr!Thread::DoAppropriateWait+7d    
coreclr!CLREventBase::WaitEx+7f    
coreclr!SyncBlock::Wait+194    
[[GCFrame]]    
coreclr!ObjectNative::WaitTimeout+d4    
[[HelperMethodFrame_1OBJ] (System.Threading.Monitor.ObjWait)] System.Threading.Monitor.ObjWait(Boolean, Int32, System.Object)    
System.Threading.ManualResetEventSlim.Wait(Int32, System.Threading.CancellationToken)+398    
System.Threading.Tasks.Task.SpinThenBlockingWait(Int32, System.Threading.CancellationToken)+b9    
System.Threading.Tasks.Task.InternalWait(Int32, System.Threading.CancellationToken)+1a2    
System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(System.Threading.Tasks.Task)+25    
Microsoft.AspNetCore.Mvc.Rendering.HtmlHelperPartialExtensions.Partial(Microsoft.AspNetCore.Mvc.Rendering.IHtmlHelper, System.String, System.Object, Microsoft.AspNetCore.Mvc.ViewFeatures.ViewDataDictionary)+3c    
AspNetCore._Areas_Article_Views_Article_Index_cshtml+<ExecuteAsync>d__86.MoveNext()+2c2    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[AspNetCore._Areas_Article_Views_Article_Index_cshtml+<ExecuteAsync>d__86, fkzqs1mt.znn]](<ExecuteAsync>d__86 ByRef)+45    
AspNetCore._Areas_Article_Views_Article_Index_cshtml.ExecuteAsync()+7e    
Microsoft.AspNetCore.Mvc.Razor.RazorView+<RenderPageAsync>d__14.MoveNext()+367    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder`1[[System.__Canon, System.Private.CoreLib]].Start[[Microsoft.AspNetCore.Mvc.Razor.RazorView+<RenderPageAsync>d__14, Microsoft.AspNetCore.Mvc.Razor]](<RenderPageAsync>d__14 ByRef)+4d    
Microsoft.AspNetCore.Mvc.Razor.RazorView.RenderPageAsync(Microsoft.AspNetCore.Mvc.Razor.IRazorPage, Microsoft.AspNetCore.Mvc.Rendering.ViewContext, Boolean)+ac    
Microsoft.AspNetCore.Mvc.Razor.RazorView+<RenderAsync>d__13.MoveNext()+a6    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[Microsoft.AspNetCore.Mvc.Razor.RazorView+<RenderAsync>d__13, Microsoft.AspNetCore.Mvc.Razor]](<RenderAsync>d__13 ByRef)+45    
Microsoft.AspNetCore.Mvc.Razor.RazorView.RenderAsync(Microsoft.AspNetCore.Mvc.Rendering.ViewContext)+89    
Microsoft.AspNetCore.Mvc.ViewFeatures.ViewExecutor+<ExecuteAsync>d__18.MoveNext()+2e5    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[Microsoft.AspNetCore.Mvc.ViewFeatures.ViewExecutor+<ExecuteAsync>d__18, Microsoft.AspNetCore.Mvc.ViewFeatures]](<ExecuteAsync>d__18 ByRef)+45    
Microsoft.AspNetCore.Mvc.ViewFeatures.ViewExecutor.ExecuteAsync(Microsoft.AspNetCore.Mvc.ActionContext, Microsoft.AspNetCore.Mvc.ViewEngines.IView, Microsoft.AspNetCore.Mvc.ViewFeatures.ViewDataDictionary, Microsoft.AspNetCore.Mvc.ViewFeatures.ITempDataDictionary, System.String, System.Nullable`1<Int32>)+cc    
Microsoft.AspNetCore.Mvc.ViewFeatures.Internal.ViewResultExecutor.ExecuteAsync(Microsoft.AspNetCore.Mvc.ActionContext, Microsoft.AspNetCore.Mvc.ViewEngines.IView, Microsoft.AspNetCore.Mvc.ViewResult)+8a    
Microsoft.AspNetCore.Mvc.ViewResult+<ExecuteResultAsync>d__26.MoveNext()+e5    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[Microsoft.AspNetCore.Mvc.ViewResult+<ExecuteResultAsync>d__26, Microsoft.AspNetCore.Mvc.ViewFeatures]](<ExecuteResultAsync>d__26 ByRef)+45    
Microsoft.AspNetCore.Mvc.ViewResult.ExecuteResultAsync(Microsoft.AspNetCore.Mvc.ActionContext)+89    
Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker+<InvokeResultAsync>d__30.MoveNext()+88    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker+<InvokeResultAsync>d__30, Microsoft.AspNetCore.Mvc.Core]](<InvokeResultAsync>d__30 ByRef)+45    
Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.InvokeResultAsync(Microsoft.AspNetCore.Mvc.IActionResult)+89    
Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.Next(State ByRef, Scope ByRef, System.Object ByRef, Boolean ByRef)+1351    
Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker+<InvokeNextResultFilterAsync>d__28.MoveNext()+b0    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker+<InvokeNextResultFilterAsync>d__28, Microsoft.AspNetCore.Mvc.Core]](<InvokeNextResultFilterAsync>d__28 ByRef)+45    
Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.InvokeNextResultFilterAsync()+84    
Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.Next(State ByRef, Scope ByRef, System.Object ByRef, Boolean ByRef)+128e    
Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker+<InvokeNextResourceFilter>d__22.MoveNext()+b0    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker+<InvokeNextResourceFilter>d__22, Microsoft.AspNetCore.Mvc.Core]](<InvokeNextResourceFilter>d__22 ByRef)+45    
Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.InvokeNextResourceFilter()+84    
Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.Next(State ByRef, Scope ByRef, System.Object ByRef, Boolean ByRef)+583    
Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker+<InvokeAsync>d__20.MoveNext()+1f3    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker+<InvokeAsync>d__20, Microsoft.AspNetCore.Mvc.Core]](<InvokeAsync>d__20 ByRef)+45    
Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.InvokeAsync()+9b    
Microsoft.AspNetCore.Builder.RouterMiddleware+<Invoke>d__4.MoveNext()+2e4    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[Microsoft.AspNetCore.Builder.RouterMiddleware+<Invoke>d__4, Microsoft.AspNetCore.Routing]](<Invoke>d__4 ByRef)+45    
Microsoft.AspNetCore.Builder.RouterMiddleware.Invoke(Microsoft.AspNetCore.Http.HttpContext)+89    
NWebsec.AspNetCore.Middleware.Middleware.MiddlewareBase+<Invoke>d__2.MoveNext()+68    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[NWebsec.AspNetCore.Middleware.Middleware.MiddlewareBase+<Invoke>d__2, NWebsec.AspNetCore.Middleware]](<Invoke>d__2 ByRef)+45    
NWebsec.AspNetCore.Middleware.Middleware.MiddlewareBase.Invoke(Microsoft.AspNetCore.Http.HttpContext)+83    
NWebsec.AspNetCore.Middleware.Middleware.MiddlewareBase+<Invoke>d__2.MoveNext()+68    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[NWebsec.AspNetCore.Middleware.Middleware.MiddlewareBase+<Invoke>d__2, NWebsec.AspNetCore.Middleware]](<Invoke>d__2 ByRef)+45    
NWebsec.AspNetCore.Middleware.Middleware.MiddlewareBase.Invoke(Microsoft.AspNetCore.Http.HttpContext)+83    
NWebsec.AspNetCore.Middleware.Middleware.MiddlewareBase+<Invoke>d__2.MoveNext()+68    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[NWebsec.AspNetCore.Middleware.Middleware.MiddlewareBase+<Invoke>d__2, NWebsec.AspNetCore.Middleware]](<Invoke>d__2 ByRef)+45    
NWebsec.AspNetCore.Middleware.Middleware.MiddlewareBase.Invoke(Microsoft.AspNetCore.Http.HttpContext)+83    
Microsoft.AspNetCore.StaticFiles.StaticFileMiddleware.Invoke(Microsoft.AspNetCore.Http.HttpContext)+21b    
Microsoft.AspNetCore.Authentication.AuthenticationMiddleware`1+<Invoke>d__18[[System.__Canon, System.Private.CoreLib]].MoveNext()+2e1    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[Microsoft.AspNetCore.Authentication.AuthenticationMiddleware`1+<Invoke>d__18[[System.__Canon, System.Private.CoreLib]], Microsoft.AspNetCore.Authentication]](<Invoke>d__18<System.__Canon> ByRef)+7e    
Microsoft.AspNetCore.Authentication.AuthenticationMiddleware`1[[System.__Canon, System.Private.CoreLib]].Invoke(Microsoft.AspNetCore.Http.HttpContext)+b9    
Microsoft.AspNetCore.Authentication.AuthenticationMiddleware`1+<Invoke>d__18[[System.__Canon, System.Private.CoreLib]].MoveNext()+2e1    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[Microsoft.AspNetCore.Authentication.AuthenticationMiddleware`1+<Invoke>d__18[[System.__Canon, System.Private.CoreLib]], Microsoft.AspNetCore.Authentication]](<Invoke>d__18<System.__Canon> ByRef)+7e    
Microsoft.AspNetCore.Authentication.AuthenticationMiddleware`1[[System.__Canon, System.Private.CoreLib]].Invoke(Microsoft.AspNetCore.Http.HttpContext)+b9    
Microsoft.AspNetCore.Authentication.AuthenticationMiddleware`1+<Invoke>d__18[[System.__Canon, System.Private.CoreLib]].MoveNext()+2e1    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[Microsoft.AspNetCore.Authentication.AuthenticationMiddleware`1+<Invoke>d__18[[System.__Canon, System.Private.CoreLib]], Microsoft.AspNetCore.Authentication]](<Invoke>d__18<System.__Canon> ByRef)+7e    
Microsoft.AspNetCore.Authentication.AuthenticationMiddleware`1[[System.__Canon, System.Private.CoreLib]].Invoke(Microsoft.AspNetCore.Http.HttpContext)+b9    
Microsoft.AspNetCore.Authentication.AuthenticationMiddleware`1+<Invoke>d__18[[System.__Canon, System.Private.CoreLib]].MoveNext()+2e1    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[Microsoft.AspNetCore.Authentication.AuthenticationMiddleware`1+<Invoke>d__18[[System.__Canon, System.Private.CoreLib]], Microsoft.AspNetCore.Authentication]](<Invoke>d__18<System.__Canon> ByRef)+7e    
Microsoft.AspNetCore.Authentication.AuthenticationMiddleware`1[[System.__Canon, System.Private.CoreLib]].Invoke(Microsoft.AspNetCore.Http.HttpContext)+b9    
Microsoft.AspNetCore.Session.SessionMiddleware+<Invoke>d__9.MoveNext()+2a6    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[Microsoft.AspNetCore.Session.SessionMiddleware+<Invoke>d__9, Microsoft.AspNetCore.Session]](<Invoke>d__9 ByRef)+45    
Microsoft.AspNetCore.Session.SessionMiddleware.Invoke(Microsoft.AspNetCore.Http.HttpContext)+89    
Microsoft.AspNetCore.Builder.RouterMiddleware+<Invoke>d__4.MoveNext()+1c9    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[Microsoft.AspNetCore.Builder.RouterMiddleware+<Invoke>d__4, Microsoft.AspNetCore.Routing]](<Invoke>d__4 ByRef)+45    
Microsoft.AspNetCore.Builder.RouterMiddleware.Invoke(Microsoft.AspNetCore.Http.HttpContext)+89    
Microsoft.AspNetCore.Diagnostics.ExceptionHandlerMiddleware+<Invoke>d__6.MoveNext()+76    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[Microsoft.AspNetCore.Diagnostics.ExceptionHandlerMiddleware+<Invoke>d__6, Microsoft.AspNetCore.Diagnostics]](<Invoke>d__6 ByRef)+45    
Microsoft.AspNetCore.Diagnostics.ExceptionHandlerMiddleware.Invoke(Microsoft.AspNetCore.Http.HttpContext)+8f    
Microsoft.AspNetCore.Diagnostics.StatusCodePagesMiddleware+<Invoke>d__3.MoveNext()+ba    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[Microsoft.AspNetCore.Diagnostics.StatusCodePagesMiddleware+<Invoke>d__3, Microsoft.AspNetCore.Diagnostics]](<Invoke>d__3 ByRef)+45    
Microsoft.AspNetCore.Diagnostics.StatusCodePagesMiddleware.Invoke(Microsoft.AspNetCore.Http.HttpContext)+89    
Microsoft.AspNetCore.Server.IISIntegration.IISMiddleware+<Invoke>d__8.MoveNext()+3c5    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[Microsoft.AspNetCore.Server.IISIntegration.IISMiddleware+<Invoke>d__8, Microsoft.AspNetCore.Server.IISIntegration]](<Invoke>d__8 ByRef)+45    
Microsoft.AspNetCore.Server.IISIntegration.IISMiddleware.Invoke(Microsoft.AspNetCore.Http.HttpContext)+89    
Microsoft.AspNetCore.Hosting.Internal.RequestServicesContainerMiddleware+<Invoke>d__3.MoveNext()+205    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[Microsoft.AspNetCore.Hosting.Internal.RequestServicesContainerMiddleware+<Invoke>d__3, Microsoft.AspNetCore.Hosting]](<Invoke>d__3 ByRef)+45    
Microsoft.AspNetCore.Hosting.Internal.RequestServicesContainerMiddleware.Invoke(Microsoft.AspNetCore.Http.HttpContext)+89    
Microsoft.AspNetCore.Hosting.Internal.HostingApplication.ProcessRequestAsync(Context)+39    
Microsoft.AspNetCore.Server.Kestrel.Internal.Http.Frame`1+<RequestProcessingAsync>d__2[[Microsoft.AspNetCore.Hosting.Internal.HostingApplication+Context, Microsoft.AspNetCore.Hosting]].MoveNext()+63e    
System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[Microsoft.AspNetCore.Server.Kestrel.Internal.Http.Frame`1+<RequestProcessingAsync>d__2[[Microsoft.AspNetCore.Hosting.Internal.HostingApplication+Context, Microsoft.AspNetCore.Hosting]], Microsoft.AspNetCore.Server.Kestrel]](<RequestProcessingAsync>d__2<Context> ByRef)+45    
Microsoft.AspNetCore.Server.Kestrel.Internal.Http.Frame`1[[Microsoft.AspNetCore.Hosting.Internal.HostingApplication+Context, Microsoft.AspNetCore.Hosting]].RequestProcessingAsync()+ad    
System.Threading.Tasks.Task`1[[System.__Canon, System.Private.CoreLib]].InnerInvoke()+9b    
System.Threading.Tasks.Task.Execute()+49    
System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object)+97    
System.Threading.Tasks.Task.ExecuteWithThreadLocal(System.Threading.Tasks.Task ByRef)+23e    
System.Threading.Tasks.Task.ExecuteEntry(Boolean)+a2    
System.Threading.ThreadPoolWorkQueue.Dispatch()+1e2    
coreclr!CallDescrWorkerInternal+83    
coreclr!MethodDescCallSite::CallTargetWorker+14e    
coreclr!QueueUserWorkItemManagedCallback+38    
coreclr!QueueUserWorkItemManagedCallback+2df    
coreclr!QueueUserWorkItemManagedCallback+234    
coreclr!QueueUserWorkItemManagedCallback+161    
[[DebuggerU2MCatchHandlerFrame]]    
coreclr!QueueUserWorkItemManagedCallback+7b    
coreclr!ManagedPerAppDomainTPCount::DispatchWorkItem+8e    
coreclr!ThreadpoolMgr::WorkerThreadStart+188    
coreclr!Thread::intermediateThreadProc+8b    
kernel32!BaseThreadInitThunk+14    
ntdll!RtlUserThreadStart+21 
Tratcher commented 7 years ago

@rynowak Microsoft.AspNetCore.Mvc.Rendering.HtmlHelperPartialExtensions.Partial(Microsoft.AspNetCore.Mvc.Rendering.IHtmlHelper, System.String, System.Object, Microsoft.AspNetCore.Mvc.ViewFeatures.ViewDataDictionary)+3c

https://github.com/aspnet/Mvc-Private/blob/34e4fbf92df2d181b441bbbde598b68d4f33d8b4/src/Microsoft.AspNetCore.Mvc.ViewFeatures/Rendering/HtmlHelperPartialExtensions.cs#L234

amirjalali1 commented 7 years ago

@Lutando what steps did you take to analyze the memory leak? Did you use any special software? I'm using dotMemory from JetBrains but I don't see any memory leak.

Tratcher commented 7 years ago

Is there something about _Areas_Article_Views_Article_Index_cshtml that should be changed from calling Partial to PartialAsync?

amirjalali1 commented 7 years ago

@Tratcher

Inside one of the Html.Partial calls is a call to

@await Component.InvokeAsync("ArticleDetailMostViewdArticles")

Should Html.Partial be changed to Html.PartialAsync in this case?

Tratcher commented 7 years ago

Sounds good to me, but I don't know those layers. Did you try it?

amirjalali1 commented 7 years ago

@Tratcher no, I didn't try it. I just commented out the InvokeAsync section so that I can find a fix for the issue.

But I can't reproduce the issue in the development environment, so it's a bit confusing and I can't find where the problem is.

The code is simple, something like this:

index.cshtml {

    <div>some html stuff</div>

    @Html.Partial {
        <distributed-cache name="MyCache">
            @await Component.InvokeAsync("ArticleDetailMostViewdArticles") {
                controller ==> dapper.QueryAsync
                view ==> foreach (var item in dapperModel) {
                    <div>
                        @Html.Partial("_ArticleItem", item) {
                            <div>model.title</div>
                        }
                    </div>
                }
            }
        </distributed-cache>
    }
}
amirjalali1 commented 7 years ago

@Tratcher

Here, Html.Partial ends up calling Html.PartialAsync:

    public static IHtmlContent Partial(
            this IHtmlHelper htmlHelper,
            string partialViewName,
            object model,
            ViewDataDictionary viewData)
        {
            if (htmlHelper == null)
            {
                throw new ArgumentNullException(nameof(htmlHelper));
            }

            if (partialViewName == null)
            {
                throw new ArgumentNullException(nameof(partialViewName));
            }

            var result = htmlHelper.PartialAsync(partialViewName, model, viewData);
            return result.GetAwaiter().GetResult();
        }

I don't think changing Partial to PartialAsync would change anything.

razzemans commented 7 years ago

We just ran into the issue again, but it seems to have resolved itself. I was unable to take a dump. However, the stdout logs show connection errors:

fail: Microsoft.AspNetCore.Diagnostics.ExceptionHandlerMiddleware[0]
      An unhandled exception has occurred: Error -4077 ECONNRESET connection reset by peer
System.IO.IOException: Error -4077 ECONNRESET connection reset by peer ---> Microsoft.AspNetCore.Server.Kestrel.Internal.Networking.UvException: Error -4077 ECONNRESET connection reset by peer
   --- End of inner exception stack trace ---
   at Microsoft.AspNetCore.Server.Kestrel.Internal.Http.SocketInput.CheckConnectionError()
   at Microsoft.AspNetCore.Server.Kestrel.Internal.Http.SocketInputExtensions.PeekAsync(SocketInput input)
   at Microsoft.AspNetCore.Server.Kestrel.Internal.Http.MessageBody.ForContentLength.PeekAsync(CancellationToken cancellationToken)
   at Microsoft.AspNetCore.Server.Kestrel.Internal.Http.MessageBody.ReadAsync(ArraySegment`1 buffer, CancellationToken cancellationToken)
   at Microsoft.AspNetCore.Server.Kestrel.Internal.Http.FrameRequestStream.ReadAsync(Byte[] buffer, Int32 offset, Int32 count, CancellationToken cancellationToken)
   at System.IO.StreamReader.<ReadBufferAsync>d__102.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at System.IO.StreamReader.<ReadAsyncInternal>d__69.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.AspNetCore.WebUtilities.FormReader.<BufferAsync>d__41.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.AspNetCore.WebUtilities.FormReader.<ReadNextPairAsyncImpl>d__34.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.AspNetCore.WebUtilities.FormReader.<ReadFormAsync>d__43.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd(Task task)
   at Microsoft.AspNetCore.Http.Features.FormFeature.<InnerReadFormAsync>d__18.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.AspNetCore.Mvc.ModelBinding.FormValueProviderFactory.<AddValueProviderAsync>d__1.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.AspNetCore.Mvc.ModelBinding.CompositeValueProvider.<CreateAsync>d__2.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.AspNetCore.Mvc.Internal.DefaultControllerArgumentBinder.<BindArgumentsCoreAsync>d__6.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.<InvokeNextExceptionFilterAsync>d__24.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.Rethrow(ExceptionContext context)
   at Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.Next(State& next, Scope& scope, Object& state, Boolean& isCompleted)
   at Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.<InvokeNextResourceFilter>d__22.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.Rethrow(ResourceExecutedContext context)
   at Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.Next(State& next, Scope& scope, Object& state, Boolean& isCompleted)
   at Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.<InvokeAsync>d__20.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.AspNetCore.Builder.RouterMiddleware.<Invoke>d__4.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.AspNetCore.Cors.Infrastructure.CorsMiddleware.<Invoke>d__7.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.AspNetCore.Diagnostics.StatusCodePagesMiddleware.<Invoke>d__3.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.AspNetCore.Diagnostics.ExceptionHandlerMiddleware.<Invoke>d__6.MoveNext()
fail: XXX.Common.Services.Caching.RedisCacheService[0]
      Unable to connect to Redis Cache
StackExchange.Redis.RedisConnectionException: No connection is available to service this operation: SETEX KEY; It was not possible to connect to the redis server(s); ConnectTimeout ---> System.AggregateException: One or more errors occurred. (It was not possible to connect to the redis server(s); ConnectTimeout) (UnableToConnect on xxx.xxx.cache.windows.net:6380/Interactive, origin: ResetNonConnected, input-buffer: 0, outstanding: 0, last-read: 79s ago, last-write: 79s ago, unanswered-write: 187024s ago, keep-alive: 60s, pending: 0, state: Connecting, last-heartbeat: never, last-mbeat: -1s ago, global: 0s ago) ---> StackExchange.Redis.RedisConnectionException: It was not possible to connect to the redis server(s); ConnectTimeout
   --- End of inner exception stack trace ---
   --- End of inner exception stack trace ---
   at StackExchange.Redis.ConnectionMultiplexer.ThrowFailed[T](TaskCompletionSource`1 source, Exception unthrownException) in c:\code\StackExchange.Redis\StackExchange.Redis\StackExchange\Redis\ConnectionMultiplexer.cs:line 2000
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at XXX.Common.Services.Caching.RedisCacheService.<SetAsync>d__10`1.MoveNext()
fail: XXX.Common.Services.Caching.RedisCacheService[0]
      Unable to connect to Redis Cache
StackExchange.Redis.RedisConnectionException: No connection is available to service this operation: GET KEY; It was not possible to connect to the redis server(s); ConnectTimeout ---> System.AggregateException: One or more errors occurred. (It was not possible to connect to the redis server(s); ConnectTimeout) (UnableToConnect on xxx.xxx.cache.windows.net:6380/Interactive, origin: ResetNonConnected, input-buffer: 0, outstanding: 0, last-read: 79s ago, last-write: 79s ago, unanswered-write: 187024s ago, keep-alive: 60s, pending: 0, state: Connecting, last-heartbeat: never, last-mbeat: -1s ago, global: 0s ago) ---> StackExchange.Redis.RedisConnectionException: It was not possible to connect to the redis server(s); ConnectTimeout
   --- End of inner exception stack trace ---
   --- End of inner exception stack trace ---
   at StackExchange.Redis.ConnectionMultiplexer.ThrowFailed[T](TaskCompletionSource`1 source, Exception unthrownException) in c:\code\StackExchange.Redis\StackExchange.Redis\StackExchange\Redis\ConnectionMultiplexer.cs:line 2000
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at XXX.Common.Services.Caching.RedisCacheService.<GetAsync>d__8`1.MoveNext()

Still hope to get more useful information soon.

amirjalali1 commented 7 years ago

I had a similar error today, although I have reported this error on ASP.NET Core 1.1.1 before in #1814.

System.IO.IOException: Error -4077 ECONNRESET connection reset by peer ---> Microsoft.AspNetCore.Server.Kestrel.Internal.Networking.UvException: Error -4077 ECONNRESET connection reset by peer --- End of inner exception stack trace --- at Microsoft.AspNetCore.Server.Kestrel.Internal.Http.SocketInput.CheckConnectionError() at Microsoft.AspNetCore.Server.Kestrel.Internal.Http.SocketInputExtensions.<PeekAsyncAwaited>d__3.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at System.Runtime.CompilerServices.ValueTaskAwaiter`1.GetResult() at Microsoft.AspNetCore.Server.Kestrel.Internal.Http.MessageBody.ForContentLength.<PeekAsyncAwaited>d__4.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at System.Runtime.CompilerServices.ValueTaskAwaiter`1.GetResult() at Microsoft.AspNetCore.Server.Kestrel.Internal.Http.MessageBody.<ReadAsyncAwaited>d__12.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at System.IO.StreamReader.<ReadBufferAsync>d__102.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at System.IO.StreamReader.<ReadAsyncInternal>d__69.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at Microsoft.AspNetCore.WebUtilities.FormReader.<BufferAsync>d__41.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at Microsoft.AspNetCore.WebUtilities.FormReader.<ReadNextPairAsyncImpl>d__34.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at Microsoft.AspNetCore.WebUtilities.FormReader.<ReadFormAsync>d__43.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at Microsoft.AspNetCore.Http.Features.FormFeature.<InnerReadFormAsync>d__18.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at Microsoft.AspNetCore.Antiforgery.Internal.DefaultAntiforgeryTokenStore.<GetRequestTokensAsync>d__3.MoveNext() --- End of stack trace from previous location where exception was thrown --- at 
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at Microsoft.AspNetCore.Antiforgery.Internal.DefaultAntiforgery.<ValidateRequestAsync>d__9.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at Microsoft.AspNetCore.Mvc.ViewFeatures.Internal.ValidateAntiforgeryTokenAuthorizationFilter.<OnAuthorizationAsync>d__3.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.<InvokeAsync>d__20.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at Microsoft.AspNetCore.Builder.RouterMiddleware.<Invoke>d__4.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at NWebsec.AspNetCore.Middleware.Middleware.MiddlewareBase.<Invoke>d__2.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at NWebsec.AspNetCore.Middleware.Middleware.MiddlewareBase.<Invoke>d__2.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at NWebsec.AspNetCore.Middleware.Middleware.MiddlewareBase.<Invoke>d__2.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at Microsoft.AspNetCore.Authentication.AuthenticationMiddleware`1.<Invoke>d__18.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at Microsoft.AspNetCore.Authentication.AuthenticationMiddleware`1.<Invoke>d__18.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at Microsoft.AspNetCore.Authentication.AuthenticationMiddleware`1.<Invoke>d__18.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at Microsoft.AspNetCore.Authentication.AuthenticationMiddleware`1.<Invoke>d__18.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at 
System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at Microsoft.AspNetCore.Authentication.AuthenticationMiddleware`1.<Invoke>d__18.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at Microsoft.AspNetCore.Authentication.AuthenticationMiddleware`1.<Invoke>d__18.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at Microsoft.AspNetCore.Authentication.AuthenticationMiddleware`1.<Invoke>d__18.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at Microsoft.AspNetCore.Authentication.AuthenticationMiddleware`1.<Invoke>d__18.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at Microsoft.AspNetCore.Session.SessionMiddleware.<Invoke>d__9.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at Microsoft.AspNetCore.Session.SessionMiddleware.<Invoke>d__9.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at Microsoft.AspNetCore.Builder.RouterMiddleware.<Invoke>d__4.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw() at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at Microsoft.AspNetCore.Diagnostics.ExceptionHandlerMiddleware.<Invoke>d__6.MoveNext()
muratg commented 7 years ago

@Tratcher could you please summarize the issue?

razzemans commented 7 years ago

Meanwhile I was able to take a memory dump on Azure. I have tried analyzing it myself but can't seem to get that to work. @Tratcher is there any way someone can have a look, and if so, can I share it privately?

razzemans commented 7 years ago

So I'm a bit further along; below is the output of !analyze -v. I will try to get more info, but I'm not experienced with WinDbg.

*******************************************************************************
*                                                                             *
*                        Exception Analysis                                   *
*                                                                             *
*******************************************************************************

*** WARNING: Unable to verify checksum for dotnet.exe

"D:\Program" was not found in the image list.
Debugger will attempt to load "D:\Program" at given base 00000000.

Please provide the full image name, including the extension (i.e. kernel32.dll)
for more reliable results.Base address and size overrides can be given as
.reload <image.ext>=<base>,<size>.
Unable to add module at 00000000
*** ERROR: Symbol file could not be found.  Defaulted to export symbols for ole32.dll - 
*** ERROR: Symbol file could not be found.  Defaulted to export symbols for combase.dll - 
*** WARNING: Unable to verify checksum for Microsoft.Extensions.Logging.AzureAppServices.dll
*** ERROR: Module load completed but symbols could not be loaded for Microsoft.Extensions.Logging.AzureAppServices.dll
GetUrlPageData2 (WinHttp) failed: 12002.

DUMP_CLASS: 2

DUMP_QUALIFIER: 400

FAULTING_IP: 
+0
00000000 ??              ???

EXCEPTION_RECORD:  (.exr -1)
ExceptionAddress: 00000000
   ExceptionCode: 80000003 (Break instruction exception)
  ExceptionFlags: 00000000
NumberParameters: 0

FAULTING_THREAD:  00000aa8

DEFAULT_BUCKET_ID:  BREAKPOINT_NOSOS

PROCESS_NAME:  dotnet.exe

ERROR_CODE: (NTSTATUS) 0x80000003 - {EXCEPTION}  Breakpoint  A breakpoint has been reached.

EXCEPTION_CODE: (HRESULT) 0x80000003 (2147483651) - One or more arguments are invalid

EXCEPTION_CODE_STR:  80000003

WATSON_BKT_PROCSTAMP:  57621878

WATSON_BKT_PROCVER:  1.0.1.4500

PROCESS_VER_PRODUCT:  Microsoft® .NET Core Framework

WATSON_BKT_MODULE:  unknown

WATSON_BKT_MODVER:  0.0.0.0

WATSON_BKT_MODOFFSET:  0

WATSON_BKT_MODSTAMP:  bbbbbbb4

BUILD_VERSION_STRING:  6.2.9200.16859 (win8_gdr.140228-1809)

MODLIST_WITH_TSCHKSUM_HASH:  a872fe4cb2122f09bd3e428bae9c9b8735e881d4

MODLIST_SHA1_HASH:  ce921169c123dc14e3a455f7c66042237c9836d1

NTGLOBALFLAG:  0

PROCESS_BAM_CURRENT_THROTTLED: 0

PROCESS_BAM_PREVIOUS_THROTTLED: 0

APPLICATION_VERIFIER_FLAGS:  0

PRODUCT_TYPE:  3

SUITE_MASK:  400

DUMP_FLAGS:  8000c07

DUMP_TYPE:  3

MISSING_CLR_SYMBOL: 0

ANALYSIS_SESSION_HOST:  ---

ANALYSIS_SESSION_TIME:  07-05-2017 14:26:54.0875

ANALYSIS_VERSION: 10.0.15063.468 x86fre

MANAGED_CODE: 1

MANAGED_ENGINE_MODULE:  coreclr

THREAD_ATTRIBUTES: 
ADDITIONAL_DEBUG_TEXT:  SOS.DLL is not loaded for managed code. Analysis might be incomplete

OS_LOCALE:  ENU

PROBLEM_CLASSES: 

    ID:     [0n300]
    Type:   [@APPLICATION_FAULT_STRING]
    Class:  Primary
    Scope:  DEFAULT_BUCKET_ID (Failure Bucket ID prefix)
            BUCKET_ID
    Name:   Omit
    Data:   Add
            String: [BREAKPOINT]
    PID:    [Unspecified]
    TID:    [Unspecified]
    Frame:  [0]

    ID:     [0n234]
    Type:   [NOSOS]
    Class:  Addendum
    Scope:  DEFAULT_BUCKET_ID (Failure Bucket ID prefix)
            BUCKET_ID
    Name:   Add
    Data:   Omit
    PID:    [Unspecified]
    TID:    [Unspecified]
    Frame:  [0]

BUGCHECK_STR:  BREAKPOINT_NOSOS

PRIMARY_PROBLEM_CLASS:  BREAKPOINT

LAST_CONTROL_TRANSFER:  from 75afc752 to 76f1081c

STACK_TEXT:  
00f4e648 75afc752 00000001 00f4e954 00000000 ntdll!NtWaitForMultipleObjects+0xc
WARNING: Stack unwind information not available. Following frames may be wrong.
00f4e7cc 725723a7 00000001 00000000 00000001 KERNELBASE!GetModuleHandleExW+0x5c
00f4e83c 72572134 00000001 00f4e954 00000001 coreclr!Thread::DoAppropriateWaitWorker+0xf9
00f4e8a8 7269d0a3 00000001 00f4e954 00000001 coreclr!Thread::DoAppropriateWait+0x64
00f4e980 72047865 00000000 00000000 02058154 coreclr!WaitHandleNative::CorWaitOneNative+0xd3
00f4e994 7204775c 00000000 ffffffff ffffffff System_Private_CoreLib_ni+0x427865
00f4e9b0 720477e6 00000000 00f4ea00 1e188869 System_Private_CoreLib_ni+0x42775c
00f4e9bc 1e188869 02057f54 05afee04 05af2948 System_Private_CoreLib_ni+0x4277e6
00f4ea00 1e188210 01bddba0 01bddb90 05afedd4 0x1e188869
00f4ea38 1b0408e7 00000000 00f4ea60 00f4ea54 0x1e188210
00f4ea48 727930cf 01318940 00f4ea88 7259db94 0x1b0408e7
00f4ea54 7259db94 00f4eacc 00f4ec54 725d7050 coreclr!CallDescrWorkerInternal+0x34
00f4ea88 7259de34 00f4eabc 01b0a6fc 00000000 coreclr!CallDescrWorkerWithHandler+0x60
00f4eaf8 72579db4 00f4ec2c 00000000 00000000 coreclr!MethodDescCallSite::CallTargetWorker+0x16f
00f4ec64 72579f64 00f4ec94 00f4ef14 be3535e0 coreclr!RunMain+0x15c
00f4eeb8 72533a04 00f4ef14 bec1db58 be353474 coreclr!Assembly::ExecuteMainMethod+0x90
00f4ef2c 7252241c 0130f70c 00000001 01335010 coreclr!CorHost2::ExecuteAssembly+0x124
00f4ef88 728eba91 0130f70c 00000001 00000000 coreclr!coreclr_execute_assembly+0x9c
00f4efb0 728eb35f 0130f70c 00000001 00000000 hostpolicy!coreclr::execute_assembly+0x31
00f4f39c 728eb91e 00f4f3b8 00000000 728eb7f0 hostpolicy!run+0xb7f
00f4f460 72966d9d 00000002 012f2a50 00f4fa18 hostpolicy!corehost_main+0x12e
00f4f48c 7296bb17 00f4f7ac 00f4f4e4 00000002 hostfxr!execute_app+0xad
00f4f8ac 7296af40 00f4fa18 00f4f968 00f4f8ec hostfxr!fx_muxer_t::read_config_and_execute+0x917
00f4f990 7296984c 00f4fa18 00f4fa60 00000001 hostfxr!fx_muxer_t::parse_args_and_execute+0x3a0
00f4fa88 00db897a 00000002 012f2a50 ce00d9ee hostfxr!fx_muxer_t::execute+0x21c
00f4fb00 00db8a9a 00000002 012f2a50 72a672e4 dotnet!run+0xea
00f4fb18 00dbb54c 00000002 012f2a50 01302188 dotnet!wmain+0x6a
00f4fb64 76ad86e3 7f0a3000 00f4fbb4 76f29e89 dotnet!operator new[]+0x1bd
00f4fb70 76f29e89 7f0a3000 b8278a50 00000000 kernel32!BaseThreadInitThunk+0xe
00f4fbb4 76f29e5c 00dbb5c9 7f0a3000 ffffffff ntdll!__RtlUserThreadStart+0x72
00f4fbcc 00000000 00dbb5c9 7f0a3000 00000000 ntdll!_RtlUserThreadStart+0x1b

THREAD_SHA1_HASH_MOD_FUNC:  49f97635abbc25a2c583889281547b7ba354f785

THREAD_SHA1_HASH_MOD_FUNC_OFFSET:  ae2cb61274f8d3d60fb9de804764c5183c23ba84

THREAD_SHA1_HASH_MOD:  b1aa7b978e085b437fa1a15beba996cae47b9f85

FOLLOWUP_IP: 
coreclr!Thread::DoAppropriateWaitWorker+f9
725723a7 8bf0            mov     esi,eax

FAULT_INSTR_CODE:  15fff08b

SYMBOL_STACK_INDEX:  2

SYMBOL_NAME:  coreclr!Thread::DoAppropriateWaitWorker+f9

FOLLOWUP_NAME:  MachineOwner

MODULE_NAME: coreclr

IMAGE_NAME:  coreclr.dll

DEBUG_FLR_IMAGE_TIMESTAMP:  58ed17f5

STACK_COMMAND:  dt ntdll!LdrpLastDllInitializer BaseDllName ; dt ntdll!LdrpFailureData ; ~0s; .ecxr ; kb

BUCKET_ID:  BREAKPOINT_NOSOS_coreclr!Thread::DoAppropriateWaitWorker+f9

FAILURE_EXCEPTION_CODE:  80000003

FAILURE_IMAGE_NAME:  coreclr.dll

BUCKET_ID_IMAGE_STR:  coreclr.dll

FAILURE_MODULE_NAME:  coreclr

BUCKET_ID_MODULE_STR:  coreclr

FAILURE_FUNCTION_NAME:  Thread::DoAppropriateWaitWorker

BUCKET_ID_FUNCTION_STR:  Thread::DoAppropriateWaitWorker

BUCKET_ID_OFFSET:  f9

BUCKET_ID_MODPRIVATE: 1

BUCKET_ID_MODTIMEDATESTAMP:  58ed17f5

BUCKET_ID_MODCHECKSUM:  39c35c

BUCKET_ID_MODVER_STR:  4.6.25211.1

BUCKET_ID_PREFIX_STR:  BREAKPOINT_NOSOS_

FAILURE_PROBLEM_CLASS:  BREAKPOINT

FAILURE_SYMBOL_NAME:  coreclr.dll!Thread::DoAppropriateWaitWorker

FAILURE_BUCKET_ID:  BREAKPOINT_NOSOS_80000003_coreclr.dll!Thread::DoAppropriateWaitWorker

WATSON_STAGEONE_URL:  http://watson.microsoft.com/StageOne/dotnet.exe/1.0.1.4500/57621878/unknown/0.0.0.0/bbbbbbb4/80000003/00000000.htm?Retriage=1

TARGET_TIME:  2017-07-01T08:14:23.000Z

OSBUILD:  9200

OSSERVICEPACK:  16859

SERVICEPACK_NUMBER: 0

OS_REVISION: 0

OSPLATFORM_TYPE:  x86

OSNAME:  Windows 8

OSEDITION:  Windows 8 Server TerminalServer DataCenter SingleUserTS

USER_LCID:  0

OSBUILD_TIMESTAMP:  2014-03-01 07:59:28

BUILDDATESTAMP_STR:  140228-1809

BUILDLAB_STR:  win8_gdr

BUILDOSVER_STR:  6.2.9200.16859

ANALYSIS_SESSION_ELAPSED_TIME:  15812

ANALYSIS_SOURCE:  UM

FAILURE_ID_HASH_STRING:  um:breakpoint_nosos_80000003_coreclr.dll!thread::doappropriatewaitworker

FAILURE_ID_HASH:  {138a4838-72cf-1224-f542-9293bd8cfbc2}

Followup:     MachineOwner
raRaRa commented 7 years ago

Any of you using Azure Storage? My app was failing because I had over 700 threads running at the same time, causing some kind of deadlock.

In my memory dump I found out that the Azure storage calls were causing it. The Azure Storage package is dependent on OData, which isn't asynchronous yet. I had to write my own Azure storage table client to get it fixed.

razzemans commented 7 years ago

Forget about my dump, that one is from an instance that was working fine according to Azure support. Guess I accidentally made a dump from the wrong instance :(

@raRaRa let me check with my team. I know we use it for DataProtection but I think that is run only once on startup. Thanks for the information.

omerv2 commented 7 years ago

Same here! When I publish my app with the prod config, after a while the program crashes randomly and starts responding with 50x errors. I'm stuck. Thanks

razzemans commented 7 years ago

@muratg @Tratcher Any update on this issue, since it has a "Working" label? The last 2 days we've run into massive 503 errors in production during peak load. Rebooting our instances would only help for maybe 5 minutes before the errors returned. Yesterday it started with errors in Application Insights regarding Redis:

No connection is available to service this operation: GET; It was not possible to connect to the redis server(s); ConnectTimeout

This went on for about 15 minutes and then one of our instances would simply die and start serving these 503 errors.

We, and our customer, are getting extremely frustrated by this issue.

We have become so frustrated that we're seriously starting to consider drastic alternatives, e.g. dropping .NET Core altogether or hosting on Google Cloud to see if that makes any difference. I hope an update can be given: some idea of what causes this, what a workaround might be, anything. We need this to be fixed, and very soon. Thank you.

Tratcher commented 7 years ago

@razzemans please have your azure support rep contact me directly so we can coordinate our analysis.

Tratcher commented 7 years ago

@1amirjalai I've been reviewing this thread and your issue looks different from the original post. I do think you should change to PartialAsync as originally suggested. Having that sync method in the middle of your async chain is causing threads to get blocked.
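For illustration, the change in the view itself is small; a sketch reusing the _ArticleItem partial named in the earlier comments:

@* Blocks a thread pool thread until the partial has rendered: *@
@Html.Partial("_ArticleItem", item)

@* Awaits the rendering instead, leaving the thread free for other requests: *@
@await Html.PartialAsync("_ArticleItem", item)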

Tratcher commented 7 years ago

@razzemans the reason connection exhaustion issues present errors in this way for ASP.NET Core is that we're using a reverse proxy approach and tunnel requests from IIS to a back-end process. If you've exhausted your connections then IIS can't even connect to your app and you're stuck with an ugly IIS error page.

What's causing the connection/port exhaustion is another matter. That's what I want to talk to your support rep about.

razzemans commented 7 years ago

I appreciate the reply. I will ask my support engineer to contact you. Meanwhile, do you have any idea for a workaround? Obviously rebooting does not fix the issue properly. My best guess is that at the TCP level sockets are being kept open, and restarting the Azure instance does not help as that won't close them. If that is the cause, it is understandable, and even fine, as long as we have a good workaround. Then this issue wouldn't be nearly as bad.

Tratcher commented 7 years ago

No, we don't understand enough about the problem to suggest workarounds. The OS reserves closed sockets for a given time period before they can be reused and that makes recovering after the fact quite difficult. Practical workarounds need to focus on preventing the exhaustion, and to do that we need to know which component to blame.

Tratcher commented 7 years ago

@razzemans I spoke with your support rep and they said they have been making some progress with you recently. They will continue to investigate and work with us as needed.

razzemans commented 7 years ago

@Tratcher I think we have found the root cause of the CGI exception. Somewhere in our code we accidentally added the async modifier to the Process method of a taghelper, resulting in an async void method, instead of using the correct async Task ProcessAsync method. Here is code to reproduce the CGI error:

[HtmlTargetElement("foobar", TagStructure = TagStructure.NormalOrSelfClosing)]
public class TestTagHelper : TagHelper
{

    public async override void Process(TagHelperContext context, TagHelperOutput output)
    {
        throw new Exception("oh no");
    }
}   

Calling this taghelper will crash the dotnet process:

<foobar></foobar>

If TagHelperOutput.GetChildContentAsync() throws an exception, it will also crash, for example, when you call it like this:

<foobar>@("bla".ElementAt(5))</foobar>

If I run this code on Azure, sure enough, it will cause a 503 CGI error right away. If we reload the page, the next requests to the application (whatever page) will take a long time to complete, probably because the whole application reboots.

My theory: during peak load, our tag helper might throw an exception, e.g. because of a timeout to an external service. Because of the async void method, this exception is not handled but bubbles up, causing the CGI error. The dotnet process on the instance crashes, the application restarts, the HTTP request queues grow, and the chance of another timeout (and thus another exception) increases.

I hope you agree that although having the async void method on the TagHelper was our fault, it should not cause this error. Looking at the code of @1amirjalai, he was also using a tag helper, maybe making the same mistake, which is easy to make. I therefore hope this will get fixed very soon. Meanwhile, we will fix our TagHelper and cross our fingers that the issue is resolved.

We have also been able to reproduce this in .NET Core 2.0 and confirm it is also present in that version.
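For contrast, a sketch of what the corrected tag helper could look like, overriding the Task-returning ProcessAsync instead of making Process async void; the body is illustrative only:

using System.Threading.Tasks;
using Microsoft.AspNetCore.Razor.TagHelpers;

[HtmlTargetElement("foobar", TagStructure = TagStructure.NormalOrSelfClosing)]
public class TestTagHelper : TagHelper
{
    public override async Task ProcessAsync(TagHelperContext context, TagHelperOutput output)
    {
        // Exceptions thrown here flow back through the awaited Task and reach the
        // regular request pipeline (e.g. ExceptionHandlerMiddleware) instead of
        // escaping onto a background thread and taking down the process.
        var childContent = await output.GetChildContentAsync();
        output.Content.SetHtmlContent(childContent);
    }
}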

Tratcher commented 7 years ago

I'm not aware of any possible framework fix for misuse of async void. As a rule we tell people not to use it. At best we could provide a static analyzer flagging it as a warning.

razzemans commented 7 years ago

I understand that, but surely there must be some way of handling this better? Some way of preventing the dotnet process from crashing? One tiny mistake in a tag helper bringing down a 20-instance Azure production environment, with no real way of recovering from it. That's huge. Even more so as it is seemingly extremely hard to debug in a large codebase.

Tratcher commented 7 years ago

That's a runtime 'feature'. async void pulls you out of the request execution flow and onto a background thread. .NET Core 2.0 brought back AppDomain.UnhandledException so you can see exceptions from background threads, but I'm not aware of a way to handle them. https://github.com/dotnet/corefx/pull/11275/files#diff-e17a6ed29a0800366d75c1548f5bbd5fR17

Tratcher commented 7 years ago

@davidfowl any suggestions for handling accidental async void methods and unhandled exceptions on background threads? (Only today's comments above are relevant.)

razzemans commented 7 years ago

@Tratcher @davidfowl

Shameless bump. The Azure support engineer replied to me and said the .NET Core team rejected this issue as this is by design.

Needless to say, I disagree with this decision. I think that accidentally adding a single keyword (async) to a void method should not be able to crash the whole process and cause such an ugly error page in production. It might work for weeks at a time until an exception happens; in the meantime you've made numerous deployments and you have no idea where the error is. There aren't any crash log files written with a stack trace detailing the error; with those, at least we would be able to spot the error sooner. As it is, it has taken us 5 months to debug this, causing much frustration. I'm sure we aren't the first, nor the last, to run into this issue.

So, again, I feel some solution is warranted. A runtime that crashes without any debug information is not acceptable.

davidfowl commented 7 years ago

It unfortunately can't be detected in cases like the one you specified above. One thing that might help is logging AppDomain unhandled exceptions in the hosting layer. For things we invoke dynamically (like controller actions), it might be possible to detect the async void anti-pattern, but when we provide an abstract base class with a virtual that's overridden, we just call the method expecting it to respect the semantics of the specific overload (void meaning synchronous).

razzemans commented 7 years ago

Ok, after we upgrade to 2.0 I will try to log AppDomain unhandled exceptions and see how that goes. Note that we still have random 500 CGI errors where our whole application becomes unavailable, caused by (apparently) another issue (https://github.com/dotnet/corefx/issues/24421).
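A minimal sketch of what that logging could look like, wired up at the start of Program.Main; writing to Console is just an example target, and BuildWebHost is the standard 2.0 template method:

public static void Main(string[] args)
{
    // Available again on .NET Core 2.0: lets us at least observe exceptions that
    // escape onto background threads (e.g. from an accidental async void)
    // before the process goes down.
    AppDomain.CurrentDomain.UnhandledException += (sender, eventArgs) =>
    {
        Console.Error.WriteLine(
            $"Unhandled exception (terminating: {eventArgs.IsTerminating}): {eventArgs.ExceptionObject}");
    };

    BuildWebHost(args).Run();
}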

razzemans commented 7 years ago

I think this is now a duplicate of https://github.com/dotnet/corefx/issues/24421 and if nobody objects I will close this one.