soapiestwaffles / s3-nuke

Nuke all the files and their versions from an S3 Bucket 💣🪣

Implement rate limiting retryer #23

Closed · mikelorant closed this 1 month ago

mikelorant commented 1 month ago

When receiving a rate limit response from AWS, the application would exit with an error. This was an undesirable outcome.

The AWS SDK for Go v2 provides retry support through its retry package.

By default there are a number of retryable conditions; the relevant ones here are the throttle errors (error codes such as Throttling, SlowDown, and TooManyRequestsException).

This means the default retryer already handles rate limiting, so the application does not need to handle this case explicitly.
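
As a minimal sketch (assuming the AWS SDK for Go v2, which this project uses), this is roughly how the standard retryer is attached to the shared config; retry.NewStandard is also what the SDK falls back to when no retryer is configured:

```go
package main

import (
	"context"
	"log"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/aws/retry"
	"github.com/aws/aws-sdk-go-v2/config"
)

func main() {
	// retry.NewStandard builds the SDK's default retryer; its default
	// retryable checks already classify throttle responses as retryable.
	cfg, err := config.LoadDefaultConfig(context.TODO(),
		config.WithRetryer(func() aws.Retryer {
			return retry.NewStandard()
		}),
	)
	if err != nil {
		log.Fatal(err)
	}
	_ = cfg // service clients would be built from cfg
}
```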

It is also important to note the following about client rate limiting:

Generally you will always want to return a new instance of a Retryer. This will avoid a global rate limit bucket being shared across all service clients.

This means that the S3 client must be instantiated within each goroutine and can no longer be shared, as it was in the previous implementation.
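
A sketch of what that restructuring might look like: each worker goroutine builds its own client, and because the retryer constructor passed to the config runs once per client, no rate limit bucket is shared between them. The worker count is illustrative, not taken from this PR:

```go
package main

import (
	"context"
	"log"
	"sync"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/aws/retry"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/s3"
)

func main() {
	cfg, err := config.LoadDefaultConfig(context.TODO(),
		// This constructor is invoked for every client created from cfg,
		// so each client receives its own Retryer instance and its own
		// rate limit bucket.
		config.WithRetryer(func() aws.Retryer {
			return retry.NewStandard()
		}),
	)
	if err != nil {
		log.Fatal(err)
	}

	var wg sync.WaitGroup
	for i := 0; i < 4; i++ { // illustrative worker count
		wg.Add(1)
		go func() {
			defer wg.Done()
			// Instantiate the S3 client inside the goroutine instead of
			// sharing a single client across all workers.
			client := s3.NewFromConfig(cfg)
			_ = client // deletion work for this worker would go here
		}()
	}
	wg.Wait()
}
```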

AdaptiveMode is the retry strategy that will be used:

AdaptiveMode provides an experimental retry strategy that expands on the Standard retry strategy, adding client attempt rate limits. The attempt rate limit is initially unrestricted, but becomes restricted when an attempt fails with a throttle error.

The default values for AdaptiveMode are based on NewStandard, which means a maximum of 3 attempts and a maximum backoff of 20 seconds.
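
A sketch of opting into AdaptiveMode and tuning those NewStandard-derived defaults via the StandardOptions field on AdaptiveModeOptions; the override of 10 attempts is illustrative, not from this PR:

```go
package main

import (
	"context"
	"log"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/aws/retry"
	"github.com/aws/aws-sdk-go-v2/config"
)

func main() {
	cfg, err := config.LoadDefaultConfig(context.TODO(),
		config.WithRetryer(func() aws.Retryer {
			// AdaptiveMode wraps the Standard strategy and inherits its
			// defaults (retry.DefaultMaxAttempts, retry.DefaultMaxBackoff),
			// which are adjusted through the StandardOptions field.
			return retry.NewAdaptiveMode(func(o *retry.AdaptiveModeOptions) {
				o.StandardOptions = append(o.StandardOptions,
					func(so *retry.StandardOptions) {
						so.MaxAttempts = 10 // illustrative override
					})
			})
		}),
	)
	if err != nil {
		log.Fatal(err)
	}
	_ = cfg
}
```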

Fixes: #21