aws-amplify / amplify-js

A declarative JavaScript library for application development using cloud services.
https://docs.amplify.aws/lib/q/platform/js
Apache License 2.0
9.42k stars 2.12k forks

Upload MULTIPLE images to S3 Bucket from Amplify app (React) #10938

Open atef-abdellatif opened 1 year ago

atef-abdellatif commented 1 year ago

App Id

db4nz94zrbxbj

AWS Region

us-east-1

Amplify Hosting feature

Not Applicable

Question

Hi,

First of all, I do apologize if this question is not relevant. I'm hosting a React app on AWS Amplify and added S3 storage. I'm trying to upload multiple images to the S3 bucket at once from my app (instead of uploading them one by one). I searched for a way in the AWS Amplify Storage documentation but couldn't find an answer.

I'm using a dynamic form in my code to be able to add multiple file input fields. When I console.log the results of the form submit, I get a list of JSON objects that looks like this: [{... other fields in the form}, {image: image1, alt: alt1}, {image: image2, alt: alt2}, {image: image3, alt: alt3}]. I want to upload all of the images to S3 at once.
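For context, separating the image entries from the rest of the form payload is a plain array filter. This is only a sketch: the field names and values below mirror the shape described above and are assumptions, not the actual app code.

```javascript
// Hypothetical shape of the form submit payload described above:
// one object of other form fields, then one object per image input.
const formResult = [
  { title: 'My trip', destination: 'Rome' }, // other fields (assumed names)
  { image: 'image1', alt: 'alt1' },
  { image: 'image2', alt: 'alt2' },
  { image: 'image3', alt: 'alt3' },
];

// Keep only the entries that actually carry an image.
const images = formResult.filter((entry) => 'image' in entry);
```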

I'm using GraphQL via AppSync. I truly appreciate any help!

hdworld11 commented 1 year ago

Hello @atef-abdellatif, thank you for opening this issue! This is very relevant to the APIs and Developer Experience for Storage. We are taking this into consideration for our roadmap and will update this issue on any progress.

nadetastic commented 1 year ago

@atef-abdellatif Adding to the above: since Amplify Storage currently supports putting one object at a time with Storage.put, a workaround could be to loop over the call based on the number of objects. Here's an example of how you could implement this:


// Array of files with the "key" and "content" attributes needed by
// Storage.put(key, content), reference at [1] below
const files = [
  { key, content /* , ...otherAttributes */ },
  { key, content /* , ...otherAttributes */ },
  // ...
];

// Wrapper for Storage.put that takes in each object from the array
// (an optional third config argument, e.g. { contentType }, can also be passed)
const putWrapper = async (object) => {
  try {
    const res = await Storage.put(object.key, object.content);
    return res;
  } catch (e) {
    console.error(e);
  }
};

// Handler that can be triggered to loop through the "files" array
// and invoke "putWrapper" for each object, awaiting them sequentially
const handler = async () => {
  for (const file of files) {
    await putWrapper(file);
  }
};

[1] https://docs.amplify.aws/lib/storage/upload/q/platform/js/#upload-files
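As a variation on the loop above, the uploads can also run in parallel rather than one after another. This is only a sketch, not part of the original suggestion: `uploadAll` and `uploadFn` are hypothetical names, and the single-object uploader (e.g. the `putWrapper` above) is injected as a parameter so the batching logic stays generic.

```javascript
// Fire all uploads at once; Promise.allSettled waits for every one
// to finish, whether it resolved or rejected, so one failed upload
// does not abort the rest.
const uploadAll = async (files, uploadFn) => {
  const results = await Promise.allSettled(files.map((f) => uploadFn(f)));
  const failed = results.filter((r) => r.status === 'rejected');
  if (failed.length > 0) {
    console.error(`${failed.length} of ${files.length} uploads failed`);
  }
  return results;
};
```

Sequential awaiting is simpler and gentler on bandwidth; parallel settling is faster for many small files. Either way, each object still goes through its own Storage.put call underneath.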

atef-abdellatif commented 1 year ago

Hi there,

Thank you so very much for the workaround. It worked; you saved my life. I just noticed that an "undefined" object was auto-created after my first upload (screenshot attached).

I would very much appreciate it if you could tell me how to upload files to specific buckets and/or folders (and create buckets and/or folders if they don't already exist). I'm building a web app for a big travel agency, and I would like to store images for each trip and hotel in a separate folder; sometimes a need arises to store files in separate buckets (hotels bucket, tours bucket, etc.). At the moment, all uploaded objects are pushed automatically to the "public" folder that was auto-created when I first added Storage to my app (via the command "amplify add storage" in the Terminal).

Best, Atef

nadetastic commented 1 year ago

Hi @atef-abdellatif

Glad that the workaround worked for you! Regarding your follow-up questions, note that Amplify currently supports using one S3 bucket at a time, so multiple buckets would not work directly.

However, in order to store objects in subfolders, you can include a path with your key when calling Storage.put. For example, to have a folder named "documents", you would do:

await Storage.put('documents/foo.txt','This is sample text')

This would place the "foo.txt" file in the "public/documents/" folder.

Additionally, I would recommend taking a look at the link at [1] below which discusses various access levels for public, private, and protected.

[1] https://docs.amplify.aws/lib/storage/configureaccess/q/platform/js/
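Applying this to the travel-agency case above: S3 has no real folders, so a "/" in the object key is what creates the folder-like hierarchy on upload. This is a sketch with assumed names (`tripImageKey`, `tripId`); only the key-building part is shown as runnable code.

```javascript
// Hypothetical helper: build a per-trip key so each trip's images
// land in their own "folder". The folder is created implicitly the
// first time an object with that prefix is uploaded.
const tripImageKey = (tripId, fileName) => `trips/${tripId}/${fileName}`;

// With the default "public" access level, the object then lands under
// "public/trips/<tripId>/" in the bucket:
// await Storage.put(tripImageKey('rome-2024', 'hotel.jpg'), file);
```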

ashika112 commented 1 month ago

@atef-abdellatif The feature to upload images to specific buckets is now enabled in Amplify v6 (v6.5.1). More information available here.

Does this close the feature request? As for uploading multiple images to S3, the suggestion provided in this comment would be the best way to achieve that, since S3 does not support batch upload.
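For readers on v6, a rough sketch of what the per-bucket upload could look like. This assumes the v6 `uploadData` API and its multi-bucket `bucket` option (Amplify >= 6.5.1); `hotelsBucket` and `hotelImagePath` are placeholder names, not real identifiers from this app. Only the path-building helper is shown as runnable code.

```javascript
// Hypothetical helper for per-category object paths.
const hotelImagePath = (fileName) => `public/hotels/${fileName}`;

// Sketch of the v6 call, under the assumptions above -- "hotelsBucket"
// stands in for the friendly bucket name defined in your backend:
//
//   import { uploadData } from 'aws-amplify/storage';
//   await uploadData({
//     path: hotelImagePath(file.name),
//     data: file,
//     options: { bucket: 'hotelsBucket' },
//   }).result;
```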