nathanpeck / s3-upload-stream

A Node.js module for streaming data to Amazon S3 via the multipart upload API
MIT License

Resuming in a later session results in corrupted uploads #56

Closed Rambarani closed 5 years ago

Rambarani commented 5 years ago

This is a duplicate of https://github.com/nathanpeck/s3-upload-stream/issues/45, but that one was never answered. I'm facing the exact same problem. Can you help, please?

How is one supposed to track parts as they are uploaded? Our app needs to upload very large files over unreliable connections, so we can't count on the upload ever emitting a pause event.

Currently we track parts from the part event and add them to an array. When I pass the UploadId and Parts from a previous session, s3-upload-stream always starts from the beginning of the file, uploads too much data, and corrupts the file.
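Roughly, the tracking looks like this (our own bookkeeping; the array name is illustrative):

```js
// Collect part details from the 'part' event so they can be replayed later
// as sessionDetails.Parts in a future session.
const trackedParts = [];

uploader.on('part', details => {
  trackedParts.push(details); // details is { ETag, PartNumber }
});
```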

Here's how we're calling s3-upload-stream:

```js
const destinationDetails = { /* ... */ }

const sessionDetails = {
  "UploadId": "SECRET_UPLOAD_ID",
  "Parts": [
    { "ETag": "\"ee72b288175a979f6475b7241dcb9913\"", "PartNumber": 1 },
    { "ETag": "\"e33260ba6180f1294592f25985c0109e\"", "PartNumber": 2 }
  ]
}

const uploader = s3Stream.upload(destinationDetails, sessionDetails)

uploader.on('part', details => { console.log(details) })
```

And here is what the part event logs:

```js
{ "ETag": "\"ee72b288175a979f6475b7241dcb9913\"", "PartNumber": 3 }
```

It's the exact same ETag we passed in via sessionDetails, and it gets uploaded again as another part, which corrupts the file.

nathanpeck commented 5 years ago

Pause happens locally in the browser, so you don't have to worry about the internet connection interrupting the pause. But you do have to keep track of the uploaded parts yourself, maybe in local storage.
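A rough sketch of that idea, assuming localStorage is available (Electron renderer / browser context); the storage key, bucket/key names, and how you capture the UploadId are your own bookkeeping, not part of the library's API:

```js
const AWS = require('aws-sdk');
const s3Stream = require('s3-upload-stream')(new AWS.S3());

const destinationDetails = { Bucket: 'my-bucket', Key: 'big-file.bin' }; // placeholders
const uploader = s3Stream.upload(destinationDetails);

const STORAGE_KEY = 'multipart-upload-state';
const uploadId = 'SECRET_UPLOAD_ID'; // however you captured it (e.g. from the pause event)
const parts = [];

// Persist the upload state after every completed part so a later session can resume.
uploader.on('part', details => {
  parts.push(details); // details is { ETag, PartNumber }
  localStorage.setItem(STORAGE_KEY, JSON.stringify({
    UploadId: uploadId,
    Parts: parts
  }));
});

// In a later session, restore the saved state and pass it as the second argument:
const sessionDetails = JSON.parse(localStorage.getItem(STORAGE_KEY));
const resumedUploader = s3Stream.upload(destinationDetails, sessionDetails);
```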


Rambarani commented 5 years ago

Yes, I keep track of the uploaded parts in a database and pass them in as sessionDetails:

```js
const uploader = s3Stream.upload(destinationDetails, sessionDetails)
```

But the final uploaded file is larger than the original, and it's corrupted; it won't open.

I think the part buffer we are sending is wrong.

My case is: the application is closed and reopened, and the upload should resume where it stopped. That's when this issue happens.

We need to send the part buffer from where it paused, but I think it starts sending from the beginning.

Note: pause/resume works fine under normal conditions (when the application isn't closed). The issue only happens when the application is closed and reopened. I'm using this package in an Electron app with React.

Rambarani commented 5 years ago

Finally I got it. I read the stream from where I stopped last time:

```js
fs.createReadStream(path, { start: this.state.uploadedSize });
```
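For anyone else hitting this, a rough sketch of the full resume path under the same assumptions (uploadedSize, savedUploadId, and savedParts come from your own bookkeeping from the earlier session; the names are illustrative):

```js
const fs = require('fs');

// Skip the bytes that the previous session already uploaded, then hand the
// previously uploaded parts back to s3-upload-stream as sessionDetails.
const readStream = fs.createReadStream(path, { start: uploadedSize });

const uploader = s3Stream.upload(destinationDetails, {
  UploadId: savedUploadId, // from the earlier session
  Parts: savedParts        // [{ ETag, PartNumber }, ...]
});

readStream.pipe(uploader);
```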