Open ark- opened 3 years ago
Of late I don't use or maintain this package. If you can work out the issue and fix the code with a PR it'll be welcome. I apologize I can't offer more than that. Please let me know if this turns out to be a bug in this code rather than somewhere upstream such as the S3 throttling you mentioned above.
I'm getting the following exception when using
Here `destination_folder` is a path to a folder that exists, and `image_ids_to_get` is a list of IDs, e.g. `["2a1d31d9e9bd6c85", "2b8009fb25d3403e"]`.
If I start again where I left off (omitting the IDs already downloaded), it will continue for another 100-900 images and then fail again. If I keep re-running my script, all of the images eventually download, so they must all exist in S3.
Is S3 rate limiting me? Has there been a breaking boto update?
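In case it helps anyone hitting the same thing: since the downloads succeed on re-runs, I've been working around it by wrapping each download call in a small retry helper with exponential backoff, which is roughly what S3 recommends for `SlowDown`/throttling errors. This is just a sketch under the assumption that throttling surfaces as an exception from the download call; `download_one_image` below is a stand-in for whatever call this package makes per image, not a real function in the package.

```python
import random
import time

def retry_with_backoff(fn, max_attempts=5, base_delay=1.0):
    """Call fn(); on exception, sleep with exponential backoff plus
    jitter, then retry. Re-raises the last exception if all attempts fail."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise
            # Double the delay each attempt, with a little jitter so
            # parallel workers don't all retry in lockstep.
            delay = base_delay * (2 ** attempt) + random.uniform(0, 0.1)
            time.sleep(delay)

# Hypothetical usage -- download_one_image is a placeholder for the
# actual per-image download call:
# for image_id in image_ids_to_get:
#     retry_with_backoff(lambda: download_one_image(image_id, destination_folder))
```

With 5 attempts and a 1 s base delay this waits up to 1 + 2 + 4 + 8 = 15 s per image before giving up, which has been enough to ride out the throttling in my runs.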