dotnet / aspnetcore

ASP.NET Core is a cross-platform .NET framework for building modern cloud-based web applications on Windows, Mac, or Linux.

Microsoft.AspNetCore.Server.Kestrel.Core.BadHttpRequestException: Unexpected end of request content. #23949

Open Tino92 opened 4 years ago

Tino92 commented 4 years ago

We are intermittently seeing BadHttpRequestException: Unexpected end of request content.

Microsoft.AspNetCore.Server.Kestrel.Core.BadHttpRequestException:
   at Microsoft.AspNetCore.Server.Kestrel.Core.BadHttpRequestException.Throw (Microsoft.AspNetCore.Server.Kestrel.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at Microsoft.AspNetCore.Server.Kestrel.Core.Internal.Http.Http1ContentLengthMessageBody+<ReadAsyncInternal>d__9.MoveNext (Microsoft.AspNetCore.Server.Kestrel.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Server.Kestrel.Core.Internal.Http.HttpRequestStream+<ReadAsyncInternal>d__30.MoveNext (Microsoft.AspNetCore.Server.Kestrel.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.WebUtilities.FileBufferingReadStream+<ReadAsync>d__36.MoveNext (Microsoft.AspNetCore.WebUtilities, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.WebUtilities.StreamHelperExtensions+<DrainAsync>d__3.MoveNext (Microsoft.AspNetCore.WebUtilities, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Mvc.Formatters.NewtonsoftJsonInputFormatter+<ReadRequestBodyAsync>d__13.MoveNext (Microsoft.AspNetCore.Mvc.NewtonsoftJson, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Mvc.ModelBinding.Binders.BodyModelBinder+<BindModelAsync>d__7.MoveNext (Microsoft.AspNetCore.Mvc.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Mvc.ModelBinding.ParameterBinder+<BindModelAsync>d__7.MoveNext (Microsoft.AspNetCore.Mvc.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Mvc.Controllers.ControllerBinderDelegateProvider+<>c__DisplayClass0_0+<<CreateBinderDelegate>g__Bind|0>d.MoveNext (Microsoft.AspNetCore.Mvc.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker+<<InvokeInnerFilterAsync>g__Awaited|13_0>d.MoveNext (Microsoft.AspNetCore.Mvc.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker+<<InvokeNextResourceFilter>g__Awaited|24_0>d.MoveNext (Microsoft.AspNetCore.Mvc.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.Rethrow (Microsoft.AspNetCore.Mvc.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.Next (Microsoft.AspNetCore.Mvc.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.InvokeFilterPipelineAsync (Microsoft.AspNetCore.Mvc.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker+<<InvokeAsync>g__Logged|17_1>d.MoveNext (Microsoft.AspNetCore.Mvc.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Routing.EndpointMiddleware+<<Invoke>g__AwaitRequestTask|6_0>d.MoveNext (Microsoft.AspNetCore.Routing, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Authorization.AuthorizationMiddleware+<Invoke>d__5.MoveNext (Microsoft.AspNetCore.Authorization.Policy, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Authentication.AuthenticationMiddleware+<Invoke>d__6.MoveNext (Microsoft.AspNetCore.Authentication, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Microsoft.AspNetCore.Server.Kestrel.Core.Internal.Http.HttpProtocol+<ProcessRequests>d__214`1.MoveNext (Microsoft.AspNetCore.Server.Kestrel.Core, Version=3.1.5.0, Culture=neutral, PublicKeyToken=adb9793829ddae60)

We are running on .NET Core 3.1.5; this exception only started appearing after we moved over to .NET Core 3.0.

Similar issues have been opened in the past: https://github.com/dotnet/aspnetcore/issues/19476#issuecomment-629457165 and https://github.com/dotnet/aspnetcore/issues/6575.

This exception seems to be thrown when a client aborts mid-request. My question is: should this be logged as a warning instead of surfacing as an exception? It creates a lot of noise in our logs.
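
For anyone wanting to reduce the noise in the meantime, a minimal sketch of a logging filter, assuming the noisy entries come from the Kestrel/MVC logger categories (the category names, levels, and the Startup class are placeholders to adjust for your app; this only reduces framework log volume, it does not stop the exception from reaching your own handlers or APM):

using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;

public class Program
{
    public static void Main(string[] args) =>
        Host.CreateDefaultBuilder(args)
            .ConfigureLogging(logging =>
            {
                // Only log Warning and above for the categories that report
                // "Unexpected end of request content" when a client aborts.
                logging.AddFilter("Microsoft.AspNetCore.Server.Kestrel", LogLevel.Warning);
                logging.AddFilter("Microsoft.AspNetCore.Mvc", LogLevel.Warning);
            })
            .ConfigureWebHostDefaults(webBuilder => webBuilder.UseStartup<Startup>())
            .Build()
            .Run();
}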

sunnamed434 commented 5 months ago

I experience the same issue on .NET 7.0, and I also have a global exception handler middleware, but it is not helping me :P

taylaninan commented 5 months ago

If you want, I can provide the simple API source code (approx. 150-200 lines) that causes this specific error under load (stress tests), for inspection.

waszakCeneo commented 5 months ago

For us, on .NET 6 the issue is triggered by updating one package; no other package caused the issue:

 <PackageReference Include="Microsoft.Azure.Cosmos" Version="3.29.0" />
to
 <PackageReference Include="Microsoft.Azure.Cosmos" Version="3.35.2" /> (we tried this in July 2023)
 <PackageReference Include="Microsoft.Azure.Cosmos" Version="3.36.0" /> (this in December 2023)
 <PackageReference Include="Microsoft.Azure.Cosmos" Version="3.39.0" /> (this in April 2024)
 <PackageReference Include="Microsoft.Azure.Cosmos" Version="3.31.2" /> (this on 29-30 April 2024)

So we tried upgrading to .NET 8 (without updating Microsoft.Azure.Cosmos) and it also caused the same issue, so it wasn't related to just one package. The difference is that it logs something like a 499 instead of a 400. We then tried updating the other Azure packages as well, but that didn't solve the issue.

We are also going to test updating System.Text.Json to the latest version (we skipped that because there are some breaking changes between 6 and 8).

tl;dr: on .NET 6, certain packages (like the Cosmos DB SDK) cause spikes in Kestrel. On .NET 8 the package doesn't matter; upgrading .NET with the mandatory packages is enough to cause the issue.

Also, it only shows up under stress tests or in production. It's very hard to reproduce.

perum commented 5 months ago

I'm seeing this now on .NET 8 isolated after upgrading from .NET 6 / in-proc. Does anyone have a filter for these exceptions?
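
One possible shape for such a filter, if these exceptions are flooding Application Insights, is an ITelemetryProcessor that drops just this exception type. A rough sketch (the class name and registration are illustrative, and the wiring may differ for the Functions isolated worker):

using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;

// Hypothetical processor: drops BadHttpRequestException telemetry, passes everything else on.
public class BadHttpRequestExceptionFilter : ITelemetryProcessor
{
    private readonly ITelemetryProcessor _next;

    public BadHttpRequestExceptionFilter(ITelemetryProcessor next) => _next = next;

    public void Process(ITelemetry item)
    {
        if (item is ExceptionTelemetry exceptionTelemetry &&
            exceptionTelemetry.Exception is Microsoft.AspNetCore.Http.BadHttpRequestException)
        {
            return; // swallow this telemetry item only
        }
        _next.Process(item);
    }
}

// Registration in an ASP.NET Core app (Microsoft.ApplicationInsights.AspNetCore):
// services.AddApplicationInsightsTelemetryProcessor<BadHttpRequestExceptionFilter>();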

harshpatel94 commented 5 months ago

Experiencing the same issue with my .NET 6.0, .NET 7.0, and .NET 8.0 apps. Surprisingly, this only happens when the apps are under load. Any resolution or workaround would be super helpful. Thank you.

knopa commented 5 months ago

How do you log that exception if it is only visible when Kestrel logging is set to Debug?

sunnamed434 commented 5 months ago

If you want, I can provide the simple API source code (approx. 150-200 lines) that causes this specific error under load (stress tests), for inspection.

@taylaninan can you provide it, please? I'd like to try to make a workaround for this, and I need something to test on. I have an idea to patch the method that causes this problem via the Harmony library.

sunnamed434 commented 5 months ago

I haven't hit that problem for more than 2 weeks now, even under high load. What I did was subscribe to AppDomain.CurrentDomain.UnhandledException. I also moved my API from Ubuntu to Debian and switched from systemctl to a Dockerfile to launch it. I am not sure which of these was the fix, but the error is gone, or at least the error is logged and the API no longer crashes.
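
For reference, a minimal sketch of that subscription (the logging here is illustrative; this is a last-resort hook and does not prevent the BadHttpRequestException itself):

using System;

public static class Program
{
    public static void Main(string[] args)
    {
        // Log anything that escapes all other handlers. This observes the failure;
        // it does not stop a fatal exception from terminating the process.
        AppDomain.CurrentDomain.UnhandledException += (sender, eventArgs) =>
        {
            var ex = eventArgs.ExceptionObject as Exception;
            Console.Error.WriteLine($"Unhandled exception (IsTerminating={eventArgs.IsTerminating}): {ex}");
        };

        // ... build and run the web host as usual ...
    }
}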

prakashjvm commented 4 months ago

I am able to replicate the error by having Postman send 100 requests with a 5-minute ramp-up. I got approximately 58% failures with the same error message.

knopa commented 4 months ago

Usually that kind of error means the request was aborted from the client side.
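
If that is what is happening, one way to confirm it in your own handling code is to check HttpContext.RequestAborted when the read fails. A rough sketch, with the handler shape purely illustrative:

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

// Illustrative handler body: treat "Unexpected end of request content" as a client abort
// when the request's cancellation token has fired. On .NET Core 3.1, catch
// Microsoft.AspNetCore.Server.Kestrel.Core.BadHttpRequestException instead.
public static class RequestBodyReader
{
    public static async Task HandleAsync(HttpContext context)
    {
        try
        {
            using var reader = new StreamReader(context.Request.Body);
            var body = await reader.ReadToEndAsync();
            // ... process body ...
        }
        catch (BadHttpRequestException) when (context.RequestAborted.IsCancellationRequested)
        {
            // The client went away mid-request; usually safe to ignore or log at a low level.
            context.Abort();
        }
    }
}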

mfcguin commented 4 months ago

@supermihi, sorry for missing this message; you can do what @deleteLater proposed. For our project, we use a custom exception handler middleware to ignore the BadHttpRequestException:

using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Http.Extensions; // for GetDisplayUrl()
using Microsoft.Extensions.Logging;
// On .NET 5+ BadHttpRequestException lives in Microsoft.AspNetCore.Http;
// on 3.1 it is in Microsoft.AspNetCore.Server.Kestrel.Core.

public class CustomExceptionHandlerMiddleware
{
    private readonly ILogger<CustomExceptionHandlerMiddleware> _logger;
    private readonly RequestDelegate _next;

    public CustomExceptionHandlerMiddleware(ILogger<CustomExceptionHandlerMiddleware> logger, RequestDelegate next)
    {
        _logger = logger;
        _next = next;
    }

    public async Task Invoke(HttpContext context)
    {
        try
        {
            await _next(context);
        }
        catch (BadHttpRequestException ex)
        {
            // Client aborts land here; log at Warning instead of Error to cut the noise.
            _logger.LogWarning(ex, "BadHttpRequestException");
        }
        catch (Exception exception)
        {
            _logger.LogError(exception, "Exception when invoking {Method}:{Url}",
                context.Request.Method, context.Request.GetDisplayUrl());
        }
    }
}
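
The middleware only takes effect once it is registered early in the pipeline. A minimal registration sketch, assuming a conventional Startup.Configure (the rest of the pipeline shown is illustrative):

public void Configure(IApplicationBuilder app)
{
    // Register before routing/endpoints so the catch wraps model binding,
    // which is where the exception surfaces in the stack trace above.
    app.UseMiddleware<CustomExceptionHandlerMiddleware>();

    app.UseRouting();
    app.UseAuthentication();
    app.UseAuthorization();
    app.UseEndpoints(endpoints => endpoints.MapControllers());
}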

Hello, why is it that even with a middleware like this defined, a BadHttpRequestException still occurs after the call to _next instead of being caught?

AnisTigrini commented 1 month ago

This is still happening. We are using .NET 8 and are running on k8s on Azure. As soon as we stress the pod a little with multiple concurrent connections, the exception is thrown.

michaelmarcuccio commented 1 month ago

@AnisTigrini are you using the App Insights profiler?

services.AddServiceProfiler

After I disabled this, it fixed this issue for me. This is the info that got me there: https://stackoverflow.com/questions/77855606/should-we-enable-azure-application-insights-profiler-in-production

AnisTigrini commented 1 month ago

Hey there @michaelmarcuccio, thanks for the quick reply. We actually do not have the Application Insights profiler enabled on Azure, so it makes me think it might be related to the number of concurrent connections.

AnisTigrini commented 2 weeks ago

By the way, for anyone experiencing the same problem, here is what I found. We are running a pod with .NET 8 using the official Microsoft image mcr.microsoft.com/dotnet/sdk:8.0. The pod is a .NET REST API that makes a lot of HTTP calls to other services; it's kind of a proxy, to be honest.

I tried contacting API endpoints that do not make any HTTP calls, and the server returned a response immediately, so I knew the problem was occurring when the server was making HTTP calls to other services.

I took a look by installing netstat in the pod and realized that the problem was socket starvation. In summary, we were using a deprecated API that .NET recommends avoiding (WebClient).

The solution for us was to migrate and replace all those calls with the recommended API (HttpClient). Furthermore, you should avoid creating a new HttpClient instance per request, as that can also cause socket starvation; it is meant to be used as a singleton or through IHttpClientFactory with DI, as sketched below.
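
A minimal sketch of that IHttpClientFactory pattern (the client name, base address, and proxy class are illustrative):

using System;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Extensions.DependencyInjection;

public static class HttpClientSetup
{
    // Call from ConfigureServices/Program.cs; "downstream" and the base address are placeholders.
    public static void AddDownstreamClient(IServiceCollection services) =>
        services.AddHttpClient("downstream", client =>
            client.BaseAddress = new Uri("https://example.internal/"));
}

public class DownstreamProxy
{
    private readonly IHttpClientFactory _httpClientFactory;

    public DownstreamProxy(IHttpClientFactory httpClientFactory) => _httpClientFactory = httpClientFactory;

    public async Task<string> ForwardAsync(string path)
    {
        // The factory pools and recycles the underlying handlers, which avoids the
        // socket starvation we hit with per-request WebClient/HttpClient instances.
        var client = _httpClientFactory.CreateClient("downstream");
        return await client.GetStringAsync(path);
    }
}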

I hope this helps some of you that are dealing with the issue.