ienzam / s3-multipart-upload-browser

A demonstration of chunked, resumable uploads from the browser directly to Amazon S3, with PHP as the backend.

First part of file is missing #5

Open ilatif opened 11 years ago

ilatif commented 11 years ago

When we upload a file, this.blobs contains all the blobs except the first one. For example, if you upload a 130 KB file with this.PART_SIZE set to 100 KB, this.blobs will contain 2 blobs: the first of 30 KB and the second of 0 KB.

This issue always happens on my side. Is it happening on your side too?
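The arithmetic behind this can be sketched as follows (a minimal illustration, not the library's actual code): if part numbering starts at 1 and the slice offset is computed as PART_SIZE * partNum, the first slice begins at 100 KB instead of 0, so the first 100 KB never enter this.blobs and the final "part" comes out empty.

```javascript
// Illustrative sketch of the off-by-one, with part sizes in bytes.
const PART_SIZE = 100 * 1024;

// Buggy: numbering parts from 1 makes the first slice start at
// PART_SIZE * 1, so the first 100 KB are never captured.
function buggyParts(size) {
  const parts = [];
  const count = Math.ceil(size / PART_SIZE); // 2 parts for 130 KB
  for (let partNum = 1; partNum <= count; partNum++) {
    const start = PART_SIZE * partNum;        // should be (partNum - 1)
    const end = Math.min(start + PART_SIZE, size);
    parts.push(Math.max(end - start, 0));     // second "part" is empty
  }
  return parts;                               // 130 KB file -> [30 KB, 0]
}

// Fixed: start slicing at byte 0 and advance by PART_SIZE.
function fixedParts(size) {
  const parts = [];
  for (let start = 0; start < size; start += PART_SIZE) {
    const end = Math.min(start + PART_SIZE, size);
    parts.push(end - start);                  // 130 KB file -> [100 KB, 30 KB]
  }
  return parts;
}
```

With the fix, the part sizes always sum to the file size and no empty blob is produced.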

ienzam commented 11 years ago

What browser and version are you using?

Can you find out the reason and fix it?

A major rewrite is long overdue. Hopefully within a month I will start developing a "multipart upload library" that is compatible with Amazon AWS.

ilatif commented 11 years ago

I'm using the latest version of Chrome on Mac OS X 10.7.5. Yeah, I figured out the reason for this issue too. Basically, the calculation you are doing to slice the file into parts is creating this problem. After initiating the multipart request you call uploadPart(1), and that 1 throws off the slicing. I'm sure you will agree with me. :smile:

ienzam commented 11 years ago

Hi @ilatif, can you please check out commit "cc8207db0fa6d1b069ba9899e66df18eb242551f" and see if it works?

ilatif commented 11 years ago

Hi @ienzam. Really sorry for the late reply! I have been sick for the last two days, so I haven't had a chance to check out the commit you mentioned. I will do my best to check it as soon as possible.

Thanks.

ilatif commented 11 years ago

Hi @ienzam, I have created my own little version of your library according to my needs. It sends Ajax requests one after another instead of firing them all at once: if I configure 4 parallel parts, it creates 4 initial requests, and as each one completes it initiates another, so at most 4 Ajax requests are running at a time. This reduces the overhead on the browser of handling many simultaneous Ajax requests.

If you like, I will integrate it into upload.js in your library along with the Ajax handling code, and we can see if it suits our needs.
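The "4 at a time" scheme described above can be sketched with a small worker pool (a hypothetical sketch, not ilatif's actual code; `uploadPart` stands in for the real per-part Ajax call and is assumed to return a promise):

```javascript
// Upload all parts with at most maxParallel in flight: start up to
// maxParallel worker chains, and each worker pulls the next pending
// part as soon as its current upload finishes.
function uploadAllParts(parts, uploadPart, maxParallel = 4) {
  let next = 0;
  function worker() {
    if (next >= parts.length) return Promise.resolve();
    const part = parts[next++];
    return uploadPart(part).then(worker); // finishing one starts the next
  }
  const workers = [];
  for (let i = 0; i < Math.min(maxParallel, parts.length); i++) {
    workers.push(worker());
  }
  return Promise.all(workers); // resolves when every part is uploaded
}
```

The design keeps concurrency bounded without a timer or polling: the continuation in `.then(worker)` is exactly the "on completing each request, initiate another" behavior described.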

ienzam commented 11 years ago

@ilatif The first version of the library was just doing this serial upload.

It seems there may be a problem with parallel upload.

@thecolorblue can you please check with @ilatif and find out if there is any problem?

ilatif commented 11 years ago

@ienzam Yeah, we can call this semi-parallel: not all file parts upload in parallel, only some of them, and each part that finishes uploading starts the next pending part.

I'm not sure, but I think having 10 or 15 parallel requests for all the file parts will hurt us in terms of speed. Since all the Ajax requests share the same bandwidth, a higher number of simultaneous requests means each one uploads more slowly. The real advantage of parallel requests is that we don't have to wait for big parts to finish before we proceed with small ones. What do you think about this?

ienzam commented 11 years ago

Hi @ilatif, sorry for the late reply.

Have you tried the commit cc8207db0fa6d1b069ba9899e66df18eb242551f ?

ilatif commented 11 years ago

Hi @ienzam, no worries about the late reply. I haven't tried the commit you mentioned yet :-(. I was a bit busy rewriting upload.js according to my needs, and I'm now pretty much done with it. I have created separate classes for each set of tasks.

UploadManager handles uploading a file. This class is responsible for creating the file parts, initiating the multipart request with the server, etc.

Uploader contains generic methods used to upload a file / file_part to Amazon S3.

FilePart represents a sliced file part; it is responsible for uploading itself and notifying its manager whether or not it finished uploading successfully. FilePart inherits its upload methods from Uploader.

I have also added a per-part progress display as well as overall file progress. If you want to check out my code, I will share it with you.

Thanks.
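The three classes described above might be structured roughly like this (a skeleton guessed from the comment; the class names come from ilatif's description, but the method names and signatures are hypothetical):

```javascript
// Generic upload machinery, shared by anything that talks to S3.
function Uploader() {}
Uploader.prototype.sendToS3 = function (blob, url, onProgress, onDone) {
  // placeholder: a generic XHR PUT of a blob to a signed S3 URL
};

// One sliced part of the file; uploads itself and reports back.
function FilePart(manager, blob, partNumber) {
  this.manager = manager;
  this.blob = blob;
  this.partNumber = partNumber;
}
FilePart.prototype = Object.create(Uploader.prototype); // inherits from Uploader
FilePart.prototype.upload = function (url) {
  var self = this;
  this.sendToS3(this.blob, url,
    function (loaded) { self.manager.onPartProgress(self, loaded); },
    function (ok) { self.manager.onPartDone(self, ok); });
};

// Owns the whole file: slices it into FileParts and tracks progress.
function UploadManager(file, partSize) {
  this.file = file;
  this.partSize = partSize;
  this.parts = [];
  for (var start = 0; start < file.size; start += partSize) {
    var end = Math.min(start + partSize, file.size);
    this.parts.push(new FilePart(this, file.slice(start, end), this.parts.length + 1));
  }
}
UploadManager.prototype.onPartProgress = function (part, loaded) {
  // update per-part and overall progress display
};
UploadManager.prototype.onPartDone = function (part, ok) {
  // start the next pending part, or complete the multipart upload
};
```

Slicing from byte 0 in the constructor also sidesteps the missing-first-part bug this issue is about.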

ienzam commented 11 years ago

Yes, sure. Please fork the project on GitHub and push your commits there.

ilatif commented 11 years ago

Thanks @ienzam. I will fork the repo and share the link with you.

alecsvaldez commented 7 years ago

Hi, sorry for reopening this, but I recently used this code in a project and hit the same issue, so I made this change at upload.js:92, inside the uploadPart function, along with some other modifications:

        var start = 0;
        var parts = 0;
        ...
        while (start < this.file.size) {
            // the start assignment used to be up here...
            end = Math.min(start + this.PART_SIZE, this.file.size);
            filePart = this.file.slice(start, end);
            // guard: don't push a 0 KB blob
            if (filePart.size > 0) {
                parts++;
                blobs.push(filePart);
                console.log('Part ' + parts + ': ' + start + '-' + end);
            }
            // ...and was moved down to here
            start = this.PART_SIZE * this.curUploadInfo.partNum++;
        }

I hope I can upgrade this example to AWS v3 and eventually share it with you... thanks!
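The snippet above depends on the surrounding uploader object; as a self-contained sketch (with `this.curUploadInfo.partNum` replaced by a plain local, purely for illustration), the same logic looks like:

```javascript
// Slice a file into upload parts, skipping any empty trailing blob.
// Works on anything Blob-like that exposes .size and .slice(start, end).
function sliceIntoParts(file, partSize) {
  var blobs = [];
  var start = 0;
  var partNum = 1; // mirrors this.curUploadInfo.partNum starting at 1
  while (start < file.size) {
    var end = Math.min(start + partSize, file.size);
    var filePart = file.slice(start, end);
    if (filePart.size > 0) { // guard against pushing a 0 KB blob
      blobs.push(filePart);
    }
    start = partSize * partNum++;
  }
  return blobs;
}
```

Because `start` begins at 0 and the multiply-by-`partNum` step only runs at the bottom of the loop, the first 100 KB are included and no zero-byte part is ever queued.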

ilatif commented 7 years ago

Good work @alecsvaldez 👍.