Genbox / SimpleS3

A .NET Core implementation of Amazon's S3 API with a focus on simplicity, security, and performance.
MIT License

Backblaze upload failure with files exceeding a trivial size. #55

Closed Ephron-WL closed 1 year ago

Ephron-WL commented 1 year ago

Description of the bug: Backblaze uploads fail with an HTTP client error once the file exceeds a trivial size. A 26KB file uploads successfully, but a 460KB or larger file produces the error below after a few seconds. I've attempted to debug the issue, but the call stack is a bit complex, and the author may have an easier time.

See exception:

(exception screenshots attached)

How to reproduce?

    ServiceCollection services = new ServiceCollection();
    services.AddBackBlazeB2(config => {
        config.Credentials = new StringAccessKey("...", "...");
        config.Region = BackBlazeB2Region.UsWest002;
    });

    ServiceProvider provider = services.BuildServiceProvider();
    var client = provider.GetRequiredService<BackBlazeB2Client>();

    // File size: 460KB+
    var upload = client.CreateUpload("BucketName", "Technical Details.pdf");
    var bytes = File.ReadAllBytes(@"c:\temp\Technical Details.pdf");
    await upload.UploadDataAsync(bytes);

Expected behavior: The file uploads without an exception.

Genbox commented 1 year ago

The exception leads me to believe it is a networking issue, but in that case, streams should not be read again. I'll try to replicate the issue.
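
For context, a transport-level retry can only resend a request body if the stream behind it can be rewound; the sketch below illustrates that constraint only and is not SimpleS3's internals:

    using System;
    using System.IO;

    // Illustrative only: a failed attempt has already consumed the body stream,
    // so a retry must rewind it (or give up if the stream is forward-only).
    static void RewindForRetry(Stream body)
    {
        if (!body.CanSeek)
            throw new InvalidOperationException("Body stream cannot be replayed for a retry.");

        body.Seek(0, SeekOrigin.Begin); // next attempt reads the full payload again
    }

    using var payload = new MemoryStream(new byte[] { 1, 2, 3 });
    payload.CopyTo(Stream.Null); // simulate the first attempt consuming the stream
    RewindForRetry(payload);     // rewind before the second attempt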

Genbox commented 1 year ago

I can't replicate the issue with a 500 KB file, but there are issues with Backblaze when trying to upload a 5 MB file. I have tests that previously worked, and they no longer do, so I assume it is an issue with Backblaze.

I'll investigate some more.

Genbox commented 1 year ago

I found the issue. Every provider accepts 2MB chunks when uploading with chunked streaming, except Backblaze. The default chunk size in SimpleS3 is 2MB, but Amazon recommends 80KB, so I've changed the default to 80KB for compatibility.
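
To illustrate why the chunk size matters, here is a minimal sketch of chunked streaming where the payload is written in fixed-size parts. The 80KB constant mirrors the new default mentioned above; everything else is illustrative, not SimpleS3's actual implementation:

    using System.IO;

    // Illustrative sketch: chunked streaming writes the payload in fixed-size
    // parts, so a provider that rejects a given part size (as Backblaze did
    // with 2MB) fails as soon as the first full chunk is sent.
    const int ChunkSize = 80 * 1024; // 80KB, the new SimpleS3 default

    static void WriteChunked(Stream source, Stream destination, int chunkSize)
    {
        byte[] buffer = new byte[chunkSize];
        int read;

        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            // A real chunked transfer frames each part (size header + CRLF);
            // the raw bytes are forwarded here to keep the sketch short.
            destination.Write(buffer, 0, read);
        }
    }

    using var payload = new MemoryStream(new byte[500 * 1024]); // ~500KB test payload
    WriteChunked(payload, Stream.Null, ChunkSize);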

I'll release a new version momentarily. It would be great if you could test if your issue is fixed.

Genbox commented 1 year ago

Version 2.2.0 has been released.

Ephron-WL commented 1 year ago

I tested this release. It worked for the file in question. However, as I scaled up the file size, I encountered task timeout exceptions. There seems to be a 100-second task timeout. Maybe there is a config setting I'm missing?

    var upload = client.CreateUpload("...", "FTOS-SI-ON-9.14.2.8.bin");
    var bytes = File.ReadAllBytes(@"c:\temp\FTOS-SI-ON-9.14.2.8.bin");
    await upload.UploadDataAsync(bytes);

(timeout exception screenshot attached)
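
The 100-second limit matches .NET's default HttpClient.Timeout. A minimal sketch of raising it on a hand-built HttpClient is below; how to hand such a client to SimpleS3 is a separate question, which was moved to the discussion linked in the next comment:

    using System;
    using System.Net.Http;

    // .NET's HttpClient cancels any request that runs longer than 100 seconds
    // by default, which matches the timeout above. Raising it on a custom
    // client is a general workaround; wiring that client into SimpleS3 is the
    // follow-up question moved to the linked discussion.
    var httpClient = new HttpClient
    {
        Timeout = TimeSpan.FromMinutes(15) // allow long-running large uploads
    };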

Genbox commented 1 year ago

It is not an issue with SimpleS3. I've copied your question to a discussion for you here: https://github.com/Genbox/SimpleS3/discussions/56