tlrx closed this 9 years ago
I like the solution a lot. I left some comments here and there since I think we should break out the buffering for the S3Output
and hammer it with some randomized testing.
Thanks for the review, I'll update the code. All credits go to @imotov
@s1monw @imotov @dadoonet can you review please?
I like it - left one comment
This looks like a great move forward! Congrats @tlrx
the latest changes LGTM - @imotov your call
Left a couple of really minor comments. But overall it looks good to me.
Adds an S3OutputStream that uploads blobs to the S3 storage service with two modes (single/multipart). When the length of the chunk is lower than buffer_size, the chunk is uploaded with a single request. Otherwise multiple requests are made, each of buffer_size (except the last one, which can be lower than buffer_size).
For example, when uploading a blob (say, 1Gb) with chunk_size set to accept large chunks (chunk_size = 5Gb) and buffer_size set to 100Mb, the blob is uploaded in ~10 parts of ~100Mb each. Each part upload may fail independently and is retried up to 3 times.
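To illustrate the mode selection described above, here is a minimal sketch (not the actual S3OutputStream code; `numberOfParts` is a hypothetical helper, and the `<=` threshold is an assumption) of how the single vs. multipart decision and the part count could be computed:

```java
// Hypothetical sketch of the upload planning described in this PR:
// a single request when the blob fits in buffer_size, otherwise
// ceil(length / bufferSize) parts, the last one possibly smaller.
public class UploadPlan {

    // Number of upload requests for a blob of `length` bytes.
    // Assumption: a blob exactly equal to bufferSize still goes as one request.
    static long numberOfParts(long length, long bufferSize) {
        if (length <= bufferSize) {
            return 1; // single-request upload
        }
        // Integer ceiling division; the last part holds the remainder.
        return (length + bufferSize - 1) / bufferSize;
    }

    public static void main(String[] args) {
        // Example from the description: a blob ten times the buffer size
        // is split into 10 parts, each of buffer_size.
        System.out.println(numberOfParts(1000, 100)); // 10
        System.out.println(numberOfParts(50, 100));   // 1
    }
}
```

Each part would then be uploaded with its own request and retried independently on failure, as the description states.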
Closes #117