Closed: wlandau closed this issue 1 year ago
All the API requests that use googleAuthR::gar_api_generator()
should retry with the logic here:
```r
if (grepl("^5|429|408", status_code)) {
  try_attempts <- getOption("googleAuthR.tryAttempts")
  for (i in 1:try_attempts) {
    myMessage("Trying again: ", i, " of ", try_attempts, level = 3)
    Sys.sleep((2 ^ i) + stats::runif(n = 1, min = 0, max = 1))
    the_request <- try(f)
    status_code <- as.character(the_request$status_code)
    if (grepl("^20", status_code)) break
  }
  myMessage("All attempts failed.", level = 3)
} else {
  myMessage("No retry attempted: ", error, level = 2)
}
```
If not, an "http_500" error class to catch all the 5xx errors would be helpful.
```r
abort_http <- function(status_code, msg = NULL) {
  myMessage("Custom error ", status_code, " ", msg, level = 2)
  rlang::abort(
    message = paste0("http_", status_code, " ", msg),
    class = paste0("http_", status_code)
  )
}
```
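To sketch how callers could consume such a class, here is a hypothetical, self-contained version that signals the classed condition with base R only (mirroring what `rlang::abort(class = ...)` does, without the rlang or googleAuthR dependencies), plus a `tryCatch()` handler keyed on the class:

```r
# Hypothetical: raise a condition whose class encodes the HTTP status,
# so callers can handle all errors of a given status with one handler.
abort_http <- function(status_code, msg = NULL) {
  cond <- structure(
    class = c(paste0("http_", status_code), "error", "condition"),
    list(message = paste0("http_", status_code, " ", msg), call = NULL)
  )
  stop(cond)
}

result <- tryCatch(
  abort_http(500, "backend unavailable"),
  http_500 = function(e) "retry", # catches only the http_500 class
  error = function(e) "give up"   # fallback for every other error
)
result # "retry"
```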
This should already be the case too via https://github.com/MarkEdmondson1234/googleAuthR/blob/2cab8537c2eb2446fea39c51ea619dcdebe54b10/R/utility.R#L2
Amazing, thanks!
A follow-up: just curious, do you use exponential backoff or jitter?
Bit of both? Exponentially increasing with a random factor
Thanks. Sorry, it appears I didn't have a good look at the retry loop you showed!
I am working on https://github.com/ropensci/targets/issues/1122, and I am wondering which functions in googleCloudStorageR already support automatic retries for HTTP 500+ errors. (From #180, it looks like copying does.) In targets, I use these functions:

- `gcs_auth()`
- `gcs_get_global_bucket()`
- `gcs_get_object()`
- `gcs_metadata_object()`
- `gcs_delete_object()`
- `gcs_upload()`

From https://github.com/paws-r/paws/issues/520#issuecomment-1217416070, it looks like AWS recommends retrying on 5xx errors. If this were implemented natively in the functions above, that would be amazing. If not, an "http_500" error class to catch all the 5xx errors would be helpful.
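Until native support lands, a caller-side workaround along these lines is possible. This is a hypothetical sketch: the `retry_5xx()` name, and the assumption that failures surface as conditions carrying an `http_5xx`-style class, are mine and not part of the googleCloudStorageR API.

```r
# Hypothetical wrapper: re-run a zero-argument function when it fails with an
# http_5xx-classed condition, sleeping with exponential backoff plus jitter
# between attempts. Non-5xx errors are re-raised immediately.
retry_5xx <- function(fn, attempts = 5) {
  for (i in seq_len(attempts)) {
    out <- tryCatch(fn(), error = function(e) e)
    if (!inherits(out, "error")) return(out)
    is_5xx <- any(grepl("^http_5", class(out)))
    if (!is_5xx || i == attempts) stop(out)
    Sys.sleep((2 ^ i) + stats::runif(n = 1, min = 0, max = 1))
  }
}

# Usage sketch (object and bucket names are placeholders):
# retry_5xx(function() gcs_get_object("myfile.csv", bucket = "my-bucket"))
```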