aws-amplify / amplify-cli

The AWS Amplify CLI is a toolchain for simplifying serverless web and mobile development.
Apache License 2.0
2.83k stars · 822 forks

How can I limit the amount of megabytes my users upload to S3? #5866

Open Ricardo1980 opened 4 years ago

Ricardo1980 commented 4 years ago

My project uses S3 and Cognito, among other AWS services. The users can upload their pictures to S3, and other users can see them.

My question is: what if a user (actually, an attacker) keeps uploading more and more MB of data on purpose? Is there any way to limit this? What should I do to cope with this scenario? Is it possible to set up an alarm in CloudWatch, just in case?

Thanks a lot for suggestions.

Which Category is your question related to? Storage

Amplify CLI Version 4.32.1

What AWS Services are you utilizing? S3, Cognito

Provide additional details e.g. code snippets NA

UnleashedMind commented 4 years ago

I don't think this is provided out of the box; you will have to implement your own mechanism for it. You can list all the objects uploaded by a particular user, compute the total size, and based on the result your app can take appropriate action, such as blocking further uploads once the total size reaches a certain limit.
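A minimal sketch of that check, assuming uploads land under a per-identity prefix such as `protected/<identityId>/` (the layout Amplify's protected access level uses) and an illustrative 50 MB quota:

```python
MAX_BYTES = 50 * 1024 * 1024  # illustrative per-user quota, not an Amplify default

def total_size(pages) -> int:
    """Sum object sizes across list_objects_v2 result pages (pure, testable)."""
    return sum(obj["Size"] for page in pages for obj in page.get("Contents", []))

def may_upload(current_total: int, new_file_size: int, limit: int = MAX_BYTES) -> bool:
    """Allow the upload only if the user's total stays within the quota."""
    return current_total + new_file_size <= limit

def user_total_bytes(bucket: str, identity_id: str) -> int:
    """List everything under the user's prefix and sum the sizes."""
    import boto3  # imported lazily; needs AWS credentials at runtime
    s3 = boto3.client("s3")
    pages = s3.get_paginator("list_objects_v2").paginate(
        Bucket=bucket, Prefix=f"protected/{identity_id}/"
    )
    return total_size(pages)
```

Note the check only helps if it runs in trusted code (e.g. a Lambda); a modified client can simply skip it.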

Ricardo1980 commented 4 years ago

@UnleashedMind Thanks for the reply. Isn't this a potential security issue? Do I have other options? Thanks!

davidbiller commented 4 years ago

Just update the user table after each upload, e.g. FreeSpace: 1200000.

When the user wants to upload the next file, check that the file is smaller than the remaining free space.

Keep it simple.
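In pure form, that bookkeeping is just the following (the FreeSpace field and its starting value are illustrative):

```python
def can_upload(free_space: int, file_size: int) -> bool:
    """Allow the upload only if the file fits in the remaining free space."""
    return file_size <= free_space

def after_upload(free_space: int, file_size: int) -> int:
    """New FreeSpace value to store after a successful upload."""
    if file_size > free_space:
        raise ValueError("upload exceeded remaining quota")
    return free_space - file_size
```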

Ricardo1980 commented 4 years ago

@davidbiller There is no DB involved, nor an API/Lambda. I am just using Cognito and S3 from the iOS framework. To upload files I use `AWSS3TransferUtility.default().uploadData`.

davidbiller commented 4 years ago

> @davidbiller There is no db involved or API/lambda. I am just using cognito and S3 from the iOS framework. To upload files I use: AWSS3TransferUtility.default().uploadData

You could use a custom Cognito attribute instead of a DB. Dirty, but good enough for this use case.

Ricardo1980 commented 4 years ago

@davidbiller Thanks. I didn't know that trick. Can you explain the logic involved? Thanks a lot!

davidbiller commented 4 years ago

So you use Cognito user accounts, right?

Ricardo1980 commented 4 years ago

Yes!

davidbiller commented 4 years ago

> Yes!

https://docs.aws.amazon.com/cognito/latest/developerguide/user-pool-settings-attributes.html
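A sketch of that approach with boto3 (the attribute name `custom:freeSpace` is made up for illustration; the write must happen from trusted server-side code such as a Lambda, because a user who can write their own attributes could reset the counter):

```python
def parse_free_space(attributes: list) -> int:
    """Read the custom:freeSpace counter out of a Cognito attribute list."""
    for attr in attributes:
        if attr["Name"] == "custom:freeSpace":
            return int(attr["Value"])
    return 0  # attribute not set yet

def set_free_space(user_pool_id: str, username: str, new_value: int) -> None:
    """Persist the updated counter back to the user's Cognito profile."""
    import boto3  # imported lazily; needs AWS credentials at runtime
    cognito = boto3.client("cognito-idp")
    cognito.admin_update_user_attributes(
        UserPoolId=user_pool_id,
        Username=username,
        UserAttributes=[{"Name": "custom:freeSpace", "Value": str(new_value)}],
    )
```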

UnleashedMind commented 4 years ago

I will mark this as a feature request.

Ricardo1980 commented 4 years ago

@davidbiller Can you elaborate on your approach, please? Do you mean something like tracking the MB already uploaded and saving that in a Cognito user attribute? What if the user (a hacker) modifies the app to avoid or bypass the check of this attribute before uploading? Thanks a lot!

alokrajiv commented 3 years ago

@davidbiller Are you recommending doing this check on the client side? That would not be very secure, right? I mean, a post-upload Lambda trigger could check, but that's only after the fact, right? In Firebase I have seen validation and other security rules that can be added, including file-size checks (though I'm not sure they can span all uploads of a user). I just started exploring Amplify, but if I am not wrong, Storage internally just leverages IAM, without a resolver in between.

biller-aivy commented 3 years ago

> @davidbiller are you recommending doing this check at the client side? That would not be very secure right? I mean a post upload lambda trigger could check - but its only after the fact right? In Firebase, I have seen validation and other security rules that can be added including file size checks (though not sure it can be across all uploads of a user). I just started exploring Amplify, but if I am not wrong: Storage is internally just leveraging IAM to do this without a resolver in between.

You could check the size on the frontend, then send the size to the backend; if it's OK, allow the upload. This would be the safe option.

AdrianRamzes commented 3 years ago

@biller-aivy

> You could check the Size at frontend, than send to backend the size, if its ok, allow upload. This would be the safe option.

I don't think that would solve the problem... A client can still send a request with an allowed size value and then upload a binary file much bigger than declared, which drives up the S3 cost for the app owner, and there is no mechanism to prevent this.

biller-aivy commented 3 years ago

Custom Lambda? Check the size there and upload to S3 from the Lambda. Only the traffic would be a problem then.

AdrianRamzes commented 3 years ago

@biller-aivy

> [...] upload it to s3 from lambda.

Could you explain it? (Lambda has a payload limit of 256 KB for asynchronous invocations.) https://docs.aws.amazon.com/lambda/latest/dg/gettingstarted-limits.html

alokrajiv commented 3 years ago

So, if I were doing this without Amplify, I would just create an S3 presigned URL from "trusted code" running server-side, maybe a Lambda, which can enforce a content-length-range using a POST policy. Then the client can use this URL to upload.

(Uploading the file directly to the Lambda, as mentioned in the previous comment, wouldn't work: even with synchronous Lambda invocation the limit maxes out at 6 MB, and if you go through API Gateway you hit its limit too.)

But then again, if I go with this custom approach I am not leveraging Amplify, so handling storage outside of it seems counterintuitive to me. Then again, I'm new to Amplify, so I could be wrong about that.
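For reference, the presigned-POST idea sketched with boto3 (bucket, key, and the size limit are placeholders; `content-length-range` is an S3 POST policy condition, so the cap is enforced by S3 itself rather than by the client):

```python
def upload_policy_conditions(max_bytes: int) -> list:
    """POST policy conditions: S3 rejects any body larger than max_bytes."""
    return [["content-length-range", 0, max_bytes]]

def create_limited_upload(bucket: str, key: str, max_bytes: int, expires: int = 300) -> dict:
    """Generate a presigned POST in trusted code (e.g. a Lambda) and hand it to the client."""
    import boto3  # imported lazily; needs AWS credentials at runtime
    s3 = boto3.client("s3")
    return s3.generate_presigned_post(
        Bucket=bucket,
        Key=key,
        Conditions=upload_policy_conditions(max_bytes),
        ExpiresIn=expires,
    )
```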