Open matton2 opened 4 years ago
I have come across this recently as well with 500 errors from PUT attempts.
Warning: Error in parse_aws_s3_response: Internal Server Error (HTTP 500).
83: <Anonymous>
82: stop
81: httr::stop_for_status
80: parse_aws_s3_response
79: s3HTTP
78: aws.s3::put_object
The AWS docs suggest protecting against this by retrying, and it would be great if aws.s3 supported that. Thank you for considering it.
Because Amazon S3 is a distributed service, a very small percentage of 5xx errors are expected during normal use of the service. All requests that return 5xx errors from Amazon S3 can and should be retried, so we recommend that applications making requests to Amazon S3 have a fault-tolerance mechanism to recover from these errors.
https://aws.amazon.com/premiumsupport/knowledge-center/http-5xx-errors-s3/
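The fault-tolerance mechanism AWS recommends can be sketched in user code as a bounded retry with exponential backoff. This is only an illustrative wrapper, not part of aws.s3; the function and parameter names are made up:

```r
# Retry `expr` (a zero-argument function wrapping any aws.s3 call) up to
# `max_tries` times, sleeping with exponential backoff between attempts,
# and re-raise the last error if every attempt fails.
with_retries <- function(expr, max_tries = 5, base_wait = 1) {
  for (attempt in seq_len(max_tries)) {
    result <- tryCatch(expr(), error = function(e) e)
    if (!inherits(result, "error")) return(result)
    if (attempt == max_tries) stop(result)
    Sys.sleep(base_wait * 2^(attempt - 1))  # wait 1s, 2s, 4s, ...
  }
}

# Usage sketch (bucket/object names are placeholders):
# with_retries(function() aws.s3::put_object("file.csv", "someObject", "someBucket"))
```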
I eventually just wrapped the call in a tryCatch retry loop:
library(aws.s3)  # s3read_using()
library(readr)   # read_csv()

# Retry the read on any error; `tries` bounds the recursion so a
# persistent failure doesn't loop forever.
getAWSData <- function(bucket, tries = 5) {
  tryCatch(
    s3read_using(FUN = read_csv,
                 guess_max = 5000,
                 object = "someObject",
                 bucket = bucket,
                 opts = list(key = "someKey",
                             secret = "someSecret",
                             region = "us-east-1",
                             verbose = TRUE)),
    error = function(e) {
      if (tries <= 1) stop(e)
      getAWSData(bucket, tries - 1)
    }
  )
}
Every so often, I receive an access-denied response from AWS for a file that I know I have access to. If I try again a few minutes later, the GET succeeds. The same is true for PUT: sporadically I get a 403 error. Would it be possible to add `httr::RETRY` to s3HTTP.R?
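For reference, a hedged sketch of what the requested change might look like inside s3HTTP.R. The surrounding variable names (`verb`, `url`) are assumptions about aws.s3's internals, not its actual code; only `httr::RETRY` and its parameters are real:

```r
# Hypothetical excerpt: where s3HTTP currently issues the request directly
# (e.g. via httr::VERB), it could instead call httr::RETRY, which repeats
# the request on connection failures and HTTP error statuses, with
# exponential backoff and jitter between attempts.
r <- httr::RETRY(
  verb,              # "GET", "PUT", ... (assumed local variable)
  url = url,         # signed request URL (assumed local variable)
  times = 3,         # total attempts before giving up
  pause_base = 1,    # backoff base: waits grow roughly 1s, 2s, 4s
  pause_cap = 60     # never wait longer than 60s between attempts
)
```

By default `httr::RETRY` retries any error response, which would cover both the 500s and the sporadic 403s reported here; `terminate_on` could be used to exclude statuses that should fail fast.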