tylerhall / Autosmush

Losslessly compresses images in your Amazon S3 buckets on-the-fly.
http://clickontyler.com/blog/2010/10/automatically-compressing-your-amazon-s3-images-using-yahoos-smush-it-service/

Buckets with >1000 files #8

Closed dprvig closed 13 years ago

dprvig commented 13 years ago

Hey Tyler,

I tweeted you earlier today about a problem I was having with autosmush stopping after 1000 files (http://twitter.com/dprvig/status/25229900998778880). I looked into it and it turns out that list_objects only returns the first 1000 files unless there is a marker attached. I ended up just using get_object_list to get a list of every file I have in my bucket, well over 1000. Then I used the same loop you created to smush each image. I haven't had any issues with this method so far and have been able to smush my images past the 1000 limit.
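For anyone else hitting the same wall, here's a minimal sketch of the marker-based pagination that list_objects expects. It assumes the CloudFusion-style AmazonS3 PHP class (list_objects() with `marker`/`max-keys` options and a response body carrying Contents and IsTruncated); the bucket name and the SDK filename are placeholders, not part of Autosmush itself:

```php
<?php
// Hedged sketch: page through a bucket with more than 1000 objects by
// passing the last key seen as the 'marker' for the next request.
require_once 'sdk.class.php'; // filename depends on your SDK version

$s3     = new AmazonS3();
$bucket = 'your-bucket-name'; // placeholder
$keys   = array();
$marker = null;

do {
    $opts = array('max-keys' => 1000);
    if ($marker !== null) {
        $opts['marker'] = $marker;
    }

    $response = $s3->list_objects($bucket, $opts);

    if (isset($response->body->Contents)) {
        foreach ($response->body->Contents as $object) {
            $keys[] = (string) $object->Key;
        }
    }

    // S3 sets IsTruncated when more keys remain past this page.
    $truncated = ((string) $response->body->IsTruncated === 'true');
    if ($truncated && count($keys) > 0) {
        $marker = end($keys); // resume after the last key returned
    }
} while ($truncated);

// $keys now holds every object in the bucket, past the 1000-key cap,
// and can be fed into the same smushing loop Autosmush already uses.
```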

I've also implemented the temp_file removal process that was brought up here: https://github.com/tylerhall/Autosmush/issues#issue/7
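The cleanup itself can be as simple as deleting the local copy once the smushed image has been re-uploaded. A hedged sketch, where $temp_file is a hypothetical name for the downloaded image rather than a variable from Autosmush:

```php
// Sketch of the temp-file cleanup idea from issue #7: once the smushed
// image is safely back on S3, remove the local copy so repeated runs
// don't fill the disk with downloads.
if (isset($temp_file) && file_exists($temp_file)) {
    unlink($temp_file);
}
```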

Hope this makes the cut :)

Thanks, -Josh

tylerhall commented 13 years ago

Great work! I've merged in your changes.