minio / minio-js

MinIO Client SDK for Javascript
https://docs.min.io/docs/javascript-client-quickstart-guide.html
Apache License 2.0

multipart upload and presigned urls #772

Open thomaslange24 opened 5 years ago

thomaslange24 commented 5 years ago

If I understood correctly, multipart uploads are handled automatically by the putObject and getObject functions if the object's size is greater than 5 MiB. But what happens when I use a presigned URL for uploading a very big object?

kannappanr commented 5 years ago

@thomaslange24 I believe you should be able to use pre-signed URLs to do a multipart upload. You might have to make the individual calls yourself, like Initiate Multipart Upload, Upload Part, and Complete Multipart Upload, which minio-js normally abstracts.
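
For anyone looking for what those calls could look like in code, here is a minimal sketch using the aws-sdk (v2), since minio-js does not expose these low-level operations. The endpoint, credentials, bucket, key, and part count below are placeholders, not values from this thread:

// Sketch only: server-side helpers for a presigned multipart upload,
// using aws-sdk v2 pointed at a MinIO endpoint. All values are placeholders.
const AWS = require('aws-sdk');

const s3 = new AWS.S3({
    endpoint: 'http://localhost:9000', // assumed MinIO endpoint
    accessKeyId: 'ACCESS-KEY',
    secretAccessKey: 'SECRET-KEY',
    s3ForcePathStyle: true,
    signatureVersion: 'v4'
});

// 1. Initiate Multipart Upload and presign one URL per part.
//    The client then PUTs each part to its URL.
async function startPresignedMultipart(bucket, key, partCount) {
    const { UploadId } = await s3.createMultipartUpload({ Bucket: bucket, Key: key }).promise();
    const urls = [];
    for (let partNumber = 1; partNumber <= partCount; partNumber++) {
        urls.push(s3.getSignedUrl('uploadPart', {
            Bucket: bucket,
            Key: key,
            UploadId,
            PartNumber: partNumber,
            Expires: 3600
        }));
    }
    return { UploadId, urls };
}

// 2. Complete Multipart Upload. `parts` is [{ PartNumber, ETag }, ...],
//    collected from the ETag response header of each part upload.
async function completePresignedMultipart(bucket, key, uploadId, parts) {
    return s3.completeMultipartUpload({
        Bucket: bucket,
        Key: key,
        UploadId: uploadId,
        MultipartUpload: { Parts: parts }
    }).promise();
}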

thomaslange24 commented 5 years ago

But what would that look like? minio-js does not have functions like "Complete Multipart Upload" or "Initiate Multipart Upload". By abstracting, do you mean that minio-js performs those operations in the background? Can you give an example of how I would combine the multiple parts into one object after I have uploaded each part separately with presigned URLs?

kannappanr commented 5 years ago

@thomaslange24 Are you uploading the objects to MinIO or AWS S3? If AWS S3, you can upload objects of sizes up to 5 GB. If it is MinIO, I think you can upload objects up to 5 TB in size.

thomaslange24 commented 5 years ago

I am uploading the objects to MinIO, but I want to be independent of which S3-compatible object storage it is, so I am experimenting to find out whether to use minio-js or the aws-sdk. I need to work with presigned URLs just like in the example https://docs.min.io/docs/upload-files-from-browser-using-pre-signed-urls.html, but I still do not understand how to handle big objects as a multipart upload with minio-js in the way the example shows. Another example for that would be nice.

If AWS S3, you can upload objects of sizes up to 5 GB. If it is MinIO, I think you can upload objects up to 5 TB in size.

Are you saying that I don't have to take care of multipart upload at all if it is MinIO? But how would I use minio-js then with another S3-compatible object storage?

harshavardhana commented 5 years ago

Are you saying that I don't have to take care of multipart upload at all if it is MinIO? But how would I use minio-js then with another S3-compatible object storage?

Yes, for other compatible storages the AWS S3 spec-level restrictions apply. For multipart upload, you are better off using AssumeRole (https://github.com/minio/minio/blob/master/docs/sts/assume-role.md) or similar features, which allow for rotating credentials so that you don't have to deal with the complexities of pre-signed URLs anymore.

thomaslange24 commented 5 years ago

I am not sure I really understand how this can help me. My initial problem was that I cannot use minio-js in the browser (on the client side), according to the discussion in https://github.com/minio/minio-js/issues/729, so I started using presigned URLs. Now, AssumeRole gives me temporary credentials for the client. And since I cannot use minio-js in the browser, am I forced to use the aws-sdk instead? And can I assume that other storages also provide this API?

harshavardhana commented 5 years ago

And can I assume that other storages also provide this API?

AssumeRole is implemented as per the AWS STS specification, so yes, the aws-sdk will support this.

thomaslange24 commented 5 years ago

But again, does that mean I am not able to use the MinIO client if I want to do it this way?

thomaslange24 commented 5 years ago

I mean using the client SDK in the browser with the temporary credentials.

thomaslange24 commented 5 years ago

Well, I tried to use the aws-sdk STS client for that purpose, but I don't know how to point it at MinIO and pass the right user credentials to the functions provided in AWS.STS.
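
For reference, a rough sketch of how the aws-sdk (v2) STS client can be pointed at a MinIO server. The endpoint and credentials are placeholders, and the role ARN is a dummy value as used in the MinIO assume-role docs (the field is required by the SDK):

// Sketch only: obtaining temporary credentials from MinIO's STS endpoint
// via aws-sdk v2. Endpoint and credentials are placeholders.
const AWS = require('aws-sdk');

const sts = new AWS.STS({
    endpoint: 'http://localhost:9000',  // assumed MinIO endpoint (STS is served on the same port)
    accessKeyId: 'USER-ACCESS-KEY',     // credentials of a regular MinIO user; the temporary
    secretAccessKey: 'USER-SECRET-KEY', // credentials inherit this user's policy
    region: 'us-east-1'
});

sts.assumeRole({
    RoleArn: 'arn:xxx:xxx:xxx:xxxx',    // dummy ARN as in the MinIO assume-role docs
    RoleSessionName: 'browser-upload',
    DurationSeconds: 3600
}, (err, data) => {
    if (err) return console.error(err);
    // data.Credentials contains AccessKeyId, SecretAccessKey, SessionToken and Expiration;
    // hand these to the browser and construct an S3 client there.
    console.log(data.Credentials);
});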

ITHcc commented 5 years ago

If I understood correctly, multipart uploads are handled automatically by the putObject and getObject functions if the object's size is greater than 5 MiB. But what happens when I use a presigned URL for uploading a very big object?

What was your final solution?

palmerye commented 4 years ago

I use the aws-sdk-js to handle the upload progress.

const AWS = require('aws-sdk');
const fs = require('fs');

// assumes a Koa app with koa-router and a multipart body parser that
// exposes uploaded files on ctx.request.files (e.g. koa-body)
const Router = require('koa-router');
const router = new Router();

const s3Client = new AWS.S3({
    accessKeyId: 'Access-Key',
    secretAccessKey: 'Secret-Key',
    endpoint: 'xx',
    s3ForcePathStyle: true,
    signatureVersion: 'v4'
});

router.post('/upload', async (ctx, next) => {
    const file = ctx.request.files.file;
    const fileStream = fs.createReadStream(file.path);
    const params = {
        Bucket: 'mybuckettest',
        ContentType: file.type,
        Key: file.name,
        Body: fileStream
    };
    // upload() switches to multipart automatically for large bodies
    const manageUpload = s3Client.upload(params, {}, (err, data) => {
        if (err) {
            console.log(err);
            return;
        }
        console.log('success==', data);
    });

    manageUpload.on('httpUploadProgress', progress => {
        console.log('progress==', progress);
    });
});

result

progress== { loaded: 5242880,
  total: 25339399,
  part: 1,
  key: 'xx.wmv' }
progress== { loaded: 10485760,
  total: 25339399,
  part: 2,
  key: 'xx.wmv' }
progress== { loaded: 15728640,
  total: 25339399,
  part: 3,
  key: 'xx.wmv' }
progress== { loaded: 20971520,
  total: 25339399,
  part: 4,
  key: 'xx.wmv' }
progress== { loaded: 25339399,
  total: 25339399,
  part: 5,
  key: 'xx.wmv' }
success== { Location:
   'http://xxx:9000/mybuckettest/xx.wmv',
  Bucket: 'mybuckettest',
  Key: 'xx.wmv',
  ETag: '30915736c98c23adcd4d6xx-5' }

KadKla commented 4 years ago

The use of aws-sdk-js works pretty nicely; however, the version above uses 'fs', which is not possible in the browser. The sample from https://github.com/minio/minio-js/issues/687#issuecomment-383836279 also works in the browser :)

patchthecode commented 3 years ago

Any update on this?

I am having the same issue. I want to provide a presigned URL to my client app.

@kannappanr left a description to follow

I believe you should be able to use pre-signed URLs to do a multipart upload. You might have to make the individual calls yourself, like Initiate Multipart Upload, Upload Part, and Complete Multipart Upload, which minio-js normally abstracts.

but I have no idea what those calls should look like.

prakashsvmx commented 3 years ago

Please explore the Assume Role API. For priority support, please have a look at min.io/pricing.

patchthecode commented 3 years ago

Is simply asking this question considered priority support? 😕

At this point I simply wanted to know what this SDK was capable of before I even considered using it. After looking at the Assume Role API, I finally saw this line:

The general client SDKs don't support multipart with presigned URLs.

This is all I needed to know. Therefore I will build my own solution. Thanks.

shay-vanti commented 1 year ago

@patchthecode did you find/implement any solution? I'm having the same problem. Before MinIO, I used an S3-oriented library called evaporate.js that would split the file (on the front end) and request a presigned URL for each part, then upload the parts to S3 one by one.

With MinIO it is different, as I'm not using an S3 endpoint, but I couldn't figure out how to request presigned URLs for a multipart upload.
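
For anyone trying the evaporate-style approach in the browser, a rough sketch of what the client side could look like: slice the File, PUT each chunk to a presigned uploadPart URL obtained from your backend, and let the backend complete the upload. The /multipart/presign and /multipart/complete routes are hypothetical backend endpoints wrapping the server-side calls sketched earlier in the thread:

// Sketch only: browser-side chunked upload against presigned part URLs.
// The backend routes are hypothetical.
const PART_SIZE = 5 * 1024 * 1024; // parts (except the last) must be at least 5 MiB

async function uploadFile(file) {
    const partCount = Math.ceil(file.size / PART_SIZE);

    // Ask the backend to initiate the upload and presign one URL per part
    const { uploadId, urls } = await fetch('/multipart/presign', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ name: file.name, partCount })
    }).then(r => r.json());

    // PUT each slice of the file to its presigned URL and collect the ETags.
    // Note: the server must expose the ETag header via CORS
    // (Access-Control-Expose-Headers: ETag) for this to work in the browser.
    const parts = [];
    for (let i = 0; i < partCount; i++) {
        const chunk = file.slice(i * PART_SIZE, (i + 1) * PART_SIZE);
        const res = await fetch(urls[i], { method: 'PUT', body: chunk });
        parts.push({ PartNumber: i + 1, ETag: res.headers.get('ETag') });
    }

    // Ask the backend to complete the multipart upload
    await fetch('/multipart/complete', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ name: file.name, uploadId, parts })
    });
}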

patchthecode commented 1 year ago

@shay-vanti since MinIO is compatible with the AWS libraries, I used the latest AWS client library for the Go language.