Question:
I’m writing an application that will run on AWS. It handles user uploads such as images and videos, and I want to store that data in S3. I’m wondering how to prevent a single user from uploading gigabytes of data, since I pay for the storage. I can:
- limit single file upload size (let’s say 50MB, as videos are allowed)
- limit the number of file uploads (let’s say 1000, just an arbitrary number)
But if someone wanted to, they could create several accounts and fill my storage with junk. I could also check my bucket size before every upload and only allow the upload if the total volume is under 50 GB, for example, but that check is expensive to run for every simple upload.
Should I switch to another storage service where I can set a limit?
Or is there a common way to solve my issue?
Or should I just trust my users?
Answer:
Without knowing your use case in detail, it is difficult to recommend a single solution; in practice you combine several:
- Make your bucket private – don’t let anyone upload to it directly. Instead, generate a signed S3 URL for each request with a very short expiration (say 5 minutes) and let the user upload his/her image with that signed URL (see the presigned-upload sketch below).
- Use AWS Lambda (very cheap) to monitor `PutObject` in your bucket. The way it works is that a `PutObject` in your bucket triggers a Lambda function, which receives some information about the object, including its size and the source IP address. You can write a simple Python/Node.js/Java function to track the size and IP in some DB (either a micro Redis or DynamoDB). If you see too many uploads or a very large upload from a particular IP, dynamically add a deny statement for that IP to your bucket policy and send yourself an SES email (see the Lambda sketch below).
- Use CloudWatch – you can have CloudWatch send alerts if `BucketSizeBytes` or `NumberOfObjects` exceeds a limit (see the alarm sketch below).
- Though I haven’t used them, you can set AWS Billing Alerts so that you know as soon as your bill exceeds a preset threshold instead of being surprised at the end of the billing cycle (sketch below).
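
A minimal sketch of the presigned-upload idea from the first point, assuming boto3 and a hypothetical bucket named `my-app-uploads`. `generate_presigned_post` also lets S3 itself enforce the per-file size limit through a `content-length-range` condition, so an oversized upload is rejected without touching your backend:

```python
import boto3

s3 = boto3.client("s3")

BUCKET = "my-app-uploads"  # hypothetical bucket name

def create_upload_url(object_key, max_bytes=50 * 1024 * 1024):
    """Return presigned POST data that expires in 5 minutes and
    rejects any object larger than max_bytes (50 MB here)."""
    return s3.generate_presigned_post(
        Bucket=BUCKET,
        Key=object_key,
        Conditions=[["content-length-range", 1, max_bytes]],
        ExpiresIn=300,  # 5 minutes
    )
```

The returned dict contains `url` and `fields`; the client POSTs the file with those fields, and S3 rejects anything over the size limit before it is stored.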
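
A sketch of the Lambda side of the second point, assuming an `s3:ObjectCreated:Put` event notification on the bucket and a hypothetical DynamoDB table named `upload-tracking` with partition key `ip`. The S3 event record carries both the object size and the uploader’s source IP:

```python
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("upload-tracking")  # hypothetical table, partition key "ip"

def handler(event, context):
    """Triggered on s3:ObjectCreated:Put; accumulate bytes uploaded per source IP."""
    for record in event["Records"]:
        size = record["s3"]["object"]["size"]
        ip = record["requestParameters"]["sourceIPAddress"]
        table.update_item(
            Key={"ip": ip},
            UpdateExpression="ADD total_bytes :size, upload_count :one",
            ExpressionAttributeValues={":size": size, ":one": 1},
        )
        # Here you would compare total_bytes / upload_count against your
        # thresholds, block the IP (see below) and send an SES alert.
```

Blocking the IP means adding a deny statement to the bucket policy. Roughly (this sketch replaces the whole policy for brevity; in practice you would merge the statement into the existing policy fetched with `get_bucket_policy`):

```python
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "my-app-uploads"  # hypothetical bucket name

def block_ip(ip):
    """Deny s3:PutObject to one source IP via the bucket policy."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyAbusiveIP",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::{}/*".format(BUCKET),
            "Condition": {"IpAddress": {"aws:SourceIp": ip + "/32"}},
        }],
    }
    s3.put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))
```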
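
For the CloudWatch point, the S3 storage metrics live in the `AWS/S3` namespace and are published about once a day, so a daily period is enough. A sketch with a 50 GB threshold and a hypothetical SNS topic for the notification:

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

cloudwatch.put_metric_alarm(
    AlarmName="uploads-bucket-over-50gb",  # hypothetical alarm name
    Namespace="AWS/S3",
    MetricName="BucketSizeBytes",
    Dimensions=[
        {"Name": "BucketName", "Value": "my-app-uploads"},  # hypothetical bucket
        {"Name": "StorageType", "Value": "StandardStorage"},
    ],
    Statistic="Maximum",
    Period=86400,  # the storage metrics are daily
    EvaluationPeriods=1,
    Threshold=50 * 1024 ** 3,  # 50 GB in bytes
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:alerts"],  # hypothetical SNS topic
)
```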
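
Billing alerts work the same way once you enable billing metric publishing on the account; the `EstimatedCharges` metric is reported under `AWS/Billing` and only in `us-east-1`. A sketch with a hypothetical $20 threshold and SNS topic:

```python
import boto3

# Billing metrics are only available in us-east-1.
billing_cw = boto3.client("cloudwatch", region_name="us-east-1")

billing_cw.put_metric_alarm(
    AlarmName="monthly-bill-over-20-usd",  # hypothetical alarm name
    Namespace="AWS/Billing",
    MetricName="EstimatedCharges",
    Dimensions=[{"Name": "Currency", "Value": "USD"}],
    Statistic="Maximum",
    Period=21600,  # the metric is published a few times per day
    EvaluationPeriods=1,
    Threshold=20.0,  # alert once the estimated bill exceeds $20
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:alerts"],  # hypothetical SNS topic
)
```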