mauritzvdworm opened 3 years ago
Is it consistent? It may just be a transient error due to the Cloud Storage service being disrupted on Google's end. If it is consistent, please add an example of the code that is triggering it.
Hi @MarkEdmondson1234, I have a list of CSV files that I upload using gcs_upload() inside an lapply(). It uploads the first couple of files fine and then crashes. The crash doesn't happen with the same file each time; it is pretty random when it occurs. I'll wait a bit and try again later; maybe the issue resolves itself.
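As a workaround while the intermittent 503s persist, the lapply() loop can be wrapped in a small retry helper with backoff. This is only a sketch, not part of the original report: upload_with_retry, my_files, the "data" directory, and "my-bucket" are all placeholders, and the backoff parameters are arbitrary.

```r
library(googleCloudStorageR)

# Hypothetical helper: retry a gcs_upload() call a few times with
# exponential backoff, instead of letting one 503 abort the whole loop.
upload_with_retry <- function(file, bucket, max_tries = 5) {
  for (i in seq_len(max_tries)) {
    result <- tryCatch(
      gcs_upload(file, bucket = bucket),
      error = function(e) e
    )
    if (!inherits(result, "error")) return(result)
    message("Upload of ", file, " failed (try ", i, "): ",
            conditionMessage(result))
    Sys.sleep(2 ^ i)  # back off before the next attempt
  }
  stop("All ", max_tries, " attempts failed for ", file)
}

# Placeholder file list and bucket name for illustration only
my_files <- list.files("data", pattern = "\\.csv$", full.names = TRUE)
lapply(my_files, upload_with_retry, bucket = "my-bucket")
```

This does not fix the underlying 503s, but it makes a long batch upload survive the occasional transient failure.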
Hi everyone, here is a recent code snippet that triggers this issue.
It seems quite consistent: I've been experiencing it since 2021-11-11 18:11:51 CST and it still persists.
However, the issue seems to only affect gcs_copy_object().
Any ideas?
library(googleCloudStorageR)
library(tidyverse)

gcs_auth(json_file = "my-service-token.json")

gcs_copy_object(source_object = "iris.csv",
                source_bucket = "my-bucket",
                destination_object = "iris2.csv",
                destination_bucket = "my-bucket")
The output message is as follows:
Request failed [503]. Retrying in 1.6 seconds...
Request failed [503]. Retrying in 1 seconds...
Request Status Code: 503
Trying again: 1 of 5
Trying again: 2 of 5
Trying again: 3 of 5
Trying again: 4 of 5
Trying again: 5 of 5
All attempts failed.
API returned: Failed to look up stub from GcsStubs
Here is my sessionInfo() output:
R version 3.5.0 (2018-04-23)
Platform: x86_64-apple-darwin15.6.0 (64-bit)
Running under: macOS High Sierra 10.13.6
Matrix products: default
BLAS: /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libBLAS.dylib
LAPACK: /Library/Frameworks/R.framework/Versions/3.5/Resources/lib/libRlapack.dylib
locale:
[1] zh_TW.UTF-8/zh_TW.UTF-8/zh_TW.UTF-8/C/zh_TW.UTF-8/zh_TW.UTF-8
attached base packages:
[1] stats graphics grDevices utils datasets methods base
other attached packages:
[1] httr_1.4.1 arrow_0.15.1.1 yaml_2.2.0 lubridate_1.7.4
[5] forcats_0.3.0 stringr_1.3.1 dplyr_0.8.3 purrr_0.2.5
[9] readr_1.1.1 tidyr_1.0.0 tibble_2.1.3 ggplot2_3.3.3
[13] tidyverse_1.2.1 bigrquery_1.2.0 googleCloudStorageR_0.6.0
loaded via a namespace (and not attached):
[1] progress_1.2.0 tidyselect_0.2.5 haven_1.1.2 gargle_1.2.0 lattice_0.20-35 colorspace_1.3-2 vctrs_0.3.8
[8] utf8_1.1.4 rlang_0.4.11 later_1.0.0 pillar_1.4.2 glue_1.4.2 withr_2.1.2 DBI_1.0.0
[15] rappdirs_0.3.3 bit64_0.9-7 dbplyr_1.4.2 readxl_1.1.0 modelr_0.1.2 lifecycle_1.0.0 cellranger_1.1.0
[22] munsell_0.5.0 gtable_0.2.0 rvest_0.3.2 zip_2.0.4 memoise_1.1.0 httpuv_1.5.2 curl_3.2
[29] fansi_0.4.0 broom_0.5.0 Rcpp_1.0.2 openssl_1.0.2 promises_1.2.0.1 backports_1.1.2 scales_1.0.0
[36] jsonlite_1.6 mime_0.6 fs_1.3.1 bit_1.1-14 googleAuthR_1.4.0 hms_0.4.2 digest_0.6.18
[43] stringi_1.2.4 grid_3.5.0 cli_3.0.1 tools_3.5.0 magrittr_1.5 crayon_1.3.4 pkgconfig_2.0.2
[50] prettyunits_1.0.2 xml2_1.2.0 assertthat_0.2.0 rstudioapi_0.10 R6_2.3.0 nlme_3.1-137 compiler_3.5.0
The copy object URL endpoint had changed, so that is now fixed. Perhaps the other 503s are due to that too?
For reference, the base URL changed from www.googleapis.com to "https://storage.googleapis.com/".
Any other reports for this?
Hi @MarkEdmondson1234,
I'm facing the same issue as described above. I'm using the CRAN version of googleCloudStorageR (v0.7.0) and I've noticed that both gcs_copy_object() and gcs_delete_object() make use of the old URL ("googleapis.com/storage/..." in place of "storage.googleapis.com/...").
Are you planning a new CRAN release to fix this issue?
Can you confirm the GitHub version fixes it?
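Until the fix reaches CRAN, the development version can be installed from GitHub. A minimal sketch, assuming the package lives in the cloudyr/googleCloudStorageR repository (its usual home) and that the remotes package is available:

```r
# Install the development version, which uses the corrected
# storage.googleapis.com endpoints for copy/delete operations.
install.packages("remotes")
remotes::install_github("cloudyr/googleCloudStorageR")

library(googleCloudStorageR)
packageVersion("googleCloudStorageR")  # the dev version reports 0.7.0.9000
```

Restart the R session after installing so the new version is actually loaded in place of the CRAN build.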
Yes, the GitHub version of the package (v0.7.0.9000) runs smoothly for gcs_copy_object() and gcs_delete_object().
Started getting this error today out of nowhere. Any idea what it means?