ssvmvss opened this issue 5 years ago

Purpose

Bundled files that are being uploaded by this are never removed from AWS and take up a lot of space. Could we add some kind of worker that takes care of removing the ones that get old?
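As a rough illustration of the question, here is a minimal sketch of what such a cleanup worker could look like, assuming boto3. The bucket name comes from this thread, but the retention window is a placeholder, and real logic would also need to skip files still referenced by live deploys:

```python
# cleanup_worker.py -- hypothetical sketch, not Buffer's actual tooling.
# Deletes objects in the static bucket older than RETENTION_DAYS.
from datetime import datetime, timedelta, timezone

import boto3

BUCKET = "static.buffer.com"  # bucket name as referenced in this thread
RETENTION_DAYS = 90           # placeholder retention window

def remove_old_bundles():
    s3 = boto3.client("s3")
    cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
    stale = []
    for page in s3.get_paginator("list_objects_v2").paginate(Bucket=BUCKET):
        for obj in page.get("Contents", []):
            if obj["LastModified"] < cutoff:
                stale.append({"Key": obj["Key"]})
    # delete_objects accepts at most 1000 keys per call
    for i in range(0, len(stale), 1000):
        s3.delete_objects(Bucket=BUCKET, Delete={"Objects": stale[i:i + 1000]})

if __name__ == "__main__":
    remove_old_bundles()
```

An S3 lifecycle rule with an `Expiration` action could do the same without running a worker, though neither approach by itself knows which files are still in use, which is the hard part discussed below.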
Hey @ssvmvss, I love the idea of cleaning up S3 for files that are unneeded! The hard part is that I'm not exactly sure how we should implement this, and there are a few aspects it would have to handle.
We might be able to develop a new versioning or hashing URL structure to get around this and then be able to delete things more easily. Here are some ideas (a hedged sketch follows below):

- A key structure like `static.buffer.com/<project>/<git-hash>/path/to/file.js`, for example `static.buffer.com/analyze/8b2836d/js/bundle.js`
- Tagging each upload with something like `Environment=production` and/or a Git release when the file was uploaded

As for short-term actions, I've just added a `Name: static.buffer.com` tag to that S3 bucket to help track our usage in that bucket and see how much the storage is costing us! 😄
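To make those two ideas concrete, here is a hedged sketch of what an upload step could look like, assuming boto3. The project name, file paths, and tag keys are illustrative, not existing Buffer conventions:

```python
# upload_bundle.py -- illustrative sketch of the git-hash URL idea above.
import subprocess

import boto3

BUCKET = "static.buffer.com"  # bucket name as referenced in this thread
PROJECT = "analyze"           # example project from the URL idea above

def upload_bundle(local_path: str, relative_path: str) -> str:
    # The short commit hash becomes part of the key,
    # e.g. analyze/8b2836d/js/bundle.js
    git_hash = subprocess.check_output(
        ["git", "rev-parse", "--short", "HEAD"], text=True
    ).strip()
    key = f"{PROJECT}/{git_hash}/{relative_path}"
    boto3.client("s3").upload_file(
        local_path,
        BUCKET,
        key,
        # Object tags record the environment and release, per the second idea
        ExtraArgs={"Tagging": "Environment=production&release=" + git_hash},
    )
    return f"https://{BUCKET}/{key}"

if __name__ == "__main__":
    print(upload_bundle("dist/js/bundle.js", "js/bundle.js"))
```

With the commit hash in the key and tags on each object, a cleanup job (or a lifecycle rule scoped by prefix or tag) could delete everything except the hashes that are still deployed.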
@ssvmvss, do you have any ideas on how we could implement what you're talking about in a simpler way? Any ideas from your previous experience doing things like this?
Cross-referencing this thread:
The static.buffer.com bucket with our bundles only cost $83 in August, compared to our total S3 cost of $21k that month.
While we should clean things up, this isn't a huge cost, so it might not be worth the time right now.