
Multiple files upload hangs the process in a blazor server app #47301

Open sabrite opened 1 year ago

sabrite commented 1 year ago

Is there an existing issue for this?

Describe the bug

I have a Blazor Server app which calls an API to upload multiple images. With one or more images selected using the InputFile control, it always works:

  1. When a single file is uploaded, no matter what its size is (I am trying images of up to 10MB at the moment)
  2. When two or more images are selected and the total size of all images is less than 1MB

And it doesn't work when:

  1. Two or more images are selected, each 2MB or larger. The request made through HttpClient just dies (it never reaches the API controller).

Here is the code for anyone who wants to quickly reproduce the issue:

Index.razor

@page "/"
@using System.Net.Http.Headers
@inject HttpClient _httpClient

<PageTitle>Index</PageTitle>

<InputFile class="custom-file-input" OnChange="@LoadFiles" multiple="multiple" />
<button class="btn btn-primary" type="button" @onclick="UploadFiles">Upload</button>

@code {
    private IReadOnlyList<IBrowserFile>? _images;

    private void LoadFiles(InputFileChangeEventArgs e)
    {
        _images = e.GetMultipleFiles();
    }

    private async Task UploadFiles()
    {
        if (_images is null)
        {
            return;
        }

        const long maxFileSize = 1024 * 1024 * 300; // allow files up to 300MB each

        // Note: Timeout can only be set before this HttpClient sends its first request.
        _httpClient.Timeout = TimeSpan.FromMinutes(10);
        var uri = "https://localhost:7112/Files";
        using var content = new MultipartFormDataContent();

        // One stream per selected file; disposing the multipart content
        // also disposes the individual StreamContent parts.
        foreach (var file in _images)
        {
            var fileContent = new StreamContent(file.OpenReadStream(maxFileSize));
            fileContent.Headers.ContentType = new MediaTypeHeaderValue(file.ContentType);
            content.Add(fileContent, "files", file.Name);
        }

        var response = await _httpClient.PostAsync(uri, content);
    }
}

and here is the controller:

[HttpPost]
public void Upload([FromForm] IEnumerable<IFormFile> files)
{
    var count = files.Count();
    // Do something with the files
}

In the API project, I have tried this:

builder.WebHost.UseKestrel(options =>
{
    options.Limits.MaxRequestBodySize = null; // remove Kestrel's request body size limit
});

and in the Blazor Server project, tried this:

builder.Services.AddServerSideBlazor(options =>
{
    options.DetailedErrors = true;
    options.DisconnectedCircuitMaxRetained = 100;
    options.DisconnectedCircuitRetentionPeriod = TimeSpan.FromMinutes(3);
    options.JSInteropDefaultCallTimeout = TimeSpan.FromMinutes(10);
    options.MaxBufferedUnacknowledgedRenderBatches = 10;
}).AddHubOptions(options =>
{
    options.ClientTimeoutInterval = TimeSpan.FromMinutes(10);
    options.EnableDetailedErrors = true;
    options.HandshakeTimeout = TimeSpan.FromSeconds(15);
    options.KeepAliveInterval = TimeSpan.FromSeconds(15);
    options.MaximumParallelInvocationsPerClient = 1;
    options.MaximumReceiveMessageSize = 32 * 1024 * 1024; // 32MB
    options.StreamBufferCapacity = 32 * 1024 * 1024; // 32MB
});

Nothing seems to work. I'm not sure if this is a bug or if I'm missing something very obvious. I appreciate any help. Thanks.

Expected Behavior

All files should be successfully posted to the backend API, regardless of the size of each file.

Steps To Reproduce

No response

Exceptions (if any)

No response

.NET Version

.NET 7

Anything else?

No response

MackinnonBuck commented 1 year ago

Thanks for reaching out, @sabrite. Could you please enable trace-level server-side logging (see ASP.NET Core Logging) and provide the logs from when the issue occurs? That will help us determine what's going on here.

ghost commented 1 year ago

Hi @sabrite. We have added the "Needs: Author Feedback" label to this issue, which indicates that we have an open question for you before we can take further action. This issue will be closed automatically in 7 days if we do not hear back from you by then - please feel free to re-open it if you come back to this issue after that time.

sabrite commented 1 year ago

Thanks @MackinnonBuck - there is nothing in the logs whatsoever. The only thing recorded by the Blazor Server app is the HTTP request it sends to the API. I can't see anything with tracing enabled, or even using IntelliTrace.

I have created a repo with the bare minimum code. You should be able to reproduce this by uploading 3 or more files, each 2MB or larger. Here is the repo: https://github.com/sabrite/BlazorFileUpload

I look forward to your input. Thanks.

sabrite commented 1 year ago

@SteveSandersonMS @danroth27 can someone please look into this?

SteveSandersonMS commented 1 year ago

I've tracked this down to a defect in the backpressure mechanism with JS streams.

Workaround

One workaround is simply to read the whole file into a byte[] in memory on the .NET side before constructing the HTTP request content. The key thing is that each time you call OpenReadStream, you have to read that entire file before you call OpenReadStream on the next file. Reading multiple streams in parallel within a single user circuit is what causes the problem.
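For example, here is a minimal sketch of this workaround, reusing the fields from the repro above (the method name UploadFilesBuffered is illustrative, not from the original code):

private async Task UploadFilesBuffered()
{
    if (_images is null) return;

    const long maxFileSize = 1024 * 1024 * 300;
    using var content = new MultipartFormDataContent();

    foreach (var file in _images)
    {
        // Drain this file's stream completely before opening the next one,
        // so only a single JS stream is ever active on the circuit.
        using var stream = file.OpenReadStream(maxFileSize);
        using var buffer = new MemoryStream();
        await stream.CopyToAsync(buffer);

        var fileContent = new ByteArrayContent(buffer.ToArray());
        fileContent.Headers.ContentType = new MediaTypeHeaderValue(file.ContentType);
        content.Add(fileContent, "files", file.Name);
    }

    var response = await _httpClient.PostAsync("https://localhost:7112/Files", content);
}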

Another workaround is to set the SignalR MaximumParallelInvocationsPerClient option to a number equal to or greater than the largest number of concurrent uploads you want to be able to handle. For example, to make two parallel uploads work within the same circuit, set MaximumParallelInvocationsPerClient = 2. The default value is 1.
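Using the same AddHubOptions call shown earlier in this thread, that would look like:

builder.Services.AddServerSideBlazor().AddHubOptions(options =>
{
    // Allow two concurrent hub invocations per client so a blocked write
    // for one stream cannot starve the writes for another.
    options.MaximumParallelInvocationsPerClient = 2;
});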

Reason

The backpressure mechanism is built around a Pipe: the JS side sends calls that write to it, and the .NET side does stream operations that read from it. That works fine except when the .NET side chooses not to read from it, at which point the pipe writer fills up and the pending write call blocks. With only one parallel invocation allowed per client, that blocked call for stream B also prevents the writes for stream A from being processed, so the stream being read runs out of data and everything stalls.

The problem goes away if SignalR accepts 2 concurrent calls within the circuit, because even though the write call for stream B is blocked, the JS side can still perform writes for stream A, so A never runs out of data. And once the .NET side has finished reading A, it starts reading B, so everything continues and completes.

Solution

We need to change the backpressure mechanism so that, if the .NET side is unable to write to the pipe due to backpressure, then instead of awaiting it rejects the write instantly and returns a message to the JS side saying the write was rejected due to backpressure and must be retried later. The JS side should then wait for a short period (say, 1 second, or perhaps some exponential backoff capped at around 10 seconds) before trying again.

This way, the circuit will never have any long-pending async operations due to backpressure.
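For illustration only, a rough sketch of what a reject-instead-of-await receiver could look like (the type and member names here are invented, not the actual framework change):

using System.IO.Pipelines;
using System.Threading;
using System.Threading.Tasks;

// Illustrative only: rejects incoming chunks instead of awaiting when
// too much unread data is already buffered in the pipe.
public sealed class ChunkReceiver
{
    // pauseWriterThreshold: 0 disables the Pipe's own blocking backpressure;
    // we enforce our own limit below so writes never await on backpressure.
    private readonly Pipe _pipe = new(new PipeOptions(pauseWriterThreshold: 0));
    private long _unreadBytes;
    private const long MaxUnreadBytes = 1024 * 1024; // arbitrary 1MB cap for the sketch

    // Called for each chunk the JS side sends. Returning false tells JS
    // "rejected due to backpressure; wait and retry this chunk later".
    public async ValueTask<bool> TryReceiveChunkAsync(ReadOnlyMemory<byte> chunk)
    {
        if (Interlocked.Read(ref _unreadBytes) + chunk.Length > MaxUnreadBytes)
        {
            return false; // reject instantly instead of blocking the invocation
        }

        Interlocked.Add(ref _unreadBytes, chunk.Length);
        await _pipe.Writer.WriteAsync(chunk);
        return true;
    }

    // The .NET stream consumer reads from _pipe.Reader and calls this as it
    // consumes data, freeing capacity for further chunks.
    public void OnBytesConsumed(int count) => Interlocked.Add(ref _unreadBytes, -count);
}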

SteveSandersonMS commented 1 year ago

@mkArtakMSFT Removing my assignment now that the investigation is done. This is definitely a bug.

sabrite commented 1 year ago

Thank you @SteveSandersonMS - much appreciated. Since no one responded for almost a month, my workaround was to use a MemoryStream, as in the screenshot below, and it always worked irrespective of the number and size of the files (and now I know the reason why it works :)

[screenshot: upload code that copies each file into a MemoryStream before building the request content]

In my particular case, the user could upload any number of images, each under 300MB, so it's hard to set MaximumParallelInvocationsPerClient in advance. Nonetheless, it's great to know the cause, and I hope it gets fixed soon.

Thanks again.

ghost commented 1 year ago

Thanks for contacting us.

We're moving this issue to the .NET 8 Planning milestone for future evaluation and consideration. We would like to keep it around to collect more feedback, which can help us prioritize this work. We will re-evaluate this issue during our next planning meeting(s). If we later determine that the issue has no community involvement, or that it is a very rare and low-impact issue, we will close it so that the team can focus on more important, high-impact issues. To learn more about what to expect next and how this issue will be handled, you can read more about our triage process here.

DontEatRice commented 3 months ago

This error still occurs in .NET 8. The solution provided by @sabrite is what we are trying to avoid, because we expect that a client can upload up to 50MB, and in that scenario the whole 50MB is loaded into a MemoryStream. I see that this issue is in .NET 9 planning; will it make it into the .NET 9 release?