dmolesUC opened 5 years ago
I put in a fix to set `PartSize` to make sure `MaxUploadParts` isn't exceeded. It looks, though, like the S3 SDK wants to keep the entire part in memory, which means the maximum object size we can upload is still limited by available RAM.

On a smallish EC2 instance with 2 GB RAM, a 128 MiB `PartSize` seems to be OK, but 1 GiB leads to OOM errors after uploading a few gigabytes. (With the 128 MiB `PartSize`, resident memory peaks at about 1.6 GiB, according to `top`.)
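For reference, a minimal sketch of the part-size calculation, assuming the aws-sdk-go v1 `s3manager` package (which exposes the `MaxUploadParts` and `MinUploadPartSize` constants); the `partSizeFor` and `newUploader` helper names here are illustrative, not the actual fix:

```go
package s3util

import (
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/s3/s3manager"
)

// partSizeFor picks the smallest part size that keeps the part count
// under s3manager.MaxUploadParts (10,000), without going below the
// SDK's 5 MiB minimum.
func partSizeFor(objectSize int64) int64 {
	partSize := objectSize/int64(s3manager.MaxUploadParts) + 1
	if partSize < s3manager.MinUploadPartSize {
		partSize = s3manager.MinUploadPartSize
	}
	return partSize
}

// newUploader builds an Uploader whose PartSize is derived from the
// known object size instead of the 5 MiB default.
func newUploader(sess *session.Session, objectSize int64) *s3manager.Uploader {
	return s3manager.NewUploader(sess, func(u *s3manager.Uploader) {
		u.PartSize = partSizeFor(objectSize)
	})
}
```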
At 128 MiB and a maximum of 10K parts, we'd be limited to about 1.22 TiB or 1.34 TB. It would be good to keep digging into the API to see what facilities there are for explicit multipart uploads.
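For comparison, here's a rough sketch of what driving the multipart API explicitly might look like with the low-level `s3` client in aws-sdk-go v1, buffering only one part at a time rather than the whole upload. The `uploadMultipart` helper is hypothetical, and a real implementation would also call `AbortMultipartUpload` on failure:

```go
package s3util

import (
	"bytes"
	"io"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/s3"
)

// uploadMultipart streams r to s3://bucket/key in partSize chunks,
// holding only one part in memory at a time.
func uploadMultipart(sess *session.Session, bucket, key string, r io.Reader, partSize int64) error {
	svc := s3.New(sess)

	created, err := svc.CreateMultipartUpload(&s3.CreateMultipartUploadInput{
		Bucket: aws.String(bucket),
		Key:    aws.String(key),
	})
	if err != nil {
		return err
	}

	var completed []*s3.CompletedPart
	buf := make([]byte, partSize)
	for partNum := int64(1); ; partNum++ {
		n, readErr := io.ReadFull(r, buf)
		if n > 0 {
			out, err := svc.UploadPart(&s3.UploadPartInput{
				Bucket:     aws.String(bucket),
				Key:        aws.String(key),
				UploadId:   created.UploadId,
				PartNumber: aws.Int64(partNum),
				Body:       bytes.NewReader(buf[:n]),
			})
			if err != nil {
				return err
			}
			completed = append(completed, &s3.CompletedPart{
				ETag:       out.ETag,
				PartNumber: aws.Int64(partNum),
			})
		}
		if readErr == io.EOF || readErr == io.ErrUnexpectedEOF {
			break // last (possibly short) part has been sent
		}
		if readErr != nil {
			return readErr
		}
	}

	_, err = svc.CompleteMultipartUpload(&s3.CompleteMultipartUploadInput{
		Bucket:          aws.String(bucket),
		Key:             aws.String(key),
		UploadId:        created.UploadId,
		MultipartUpload: &s3.CompletedMultipartUpload{Parts: completed},
	})
	return err
}
```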
Steps to reproduce:
Execute the command below:
Expected:
Upload completes in about 4 hours (assuming a fast client network)
Actual:
Upload fails at 50G with: