A not-very-small reprex:
refs <- c("vimc/orderly@master", "vimc/vaultr@master", "vimc/orderlyweb",
"vimc/orderly.rstudio", "vimc/orderly.sharepoint",
"reconhub/projections@release",
"mrc-ide/sircovid@1bd9719a140fed14fc1009cc90d25488a9743f8f",
"mrc-ide/dde", "mrc-ide/squire", "mrc-ide/odin.js", "mrc-ide/odin",
"reside-ic/pointr", "richfitz/sowsear", "mrc-ide/pika",
"reside-ic/fstorr",
"mrc-ide/hermione", "ImperialCollegeLondon/epidemia@6e5c800",
"mrc-ide/dust", "mrc-ide/mcstate", "mrc-ide/odin.dust",
"mrc-ide/sircovid2"
)
pkgcache::pkg_cache_delete_files()
lib <- tempfile()
prop <- pkgdepends::new_pkg_installation_proposal(
refs, config = list(library = lib))
prop$solve()
prop$download()
dl <- prop$get_downloads()
Filter(function(x) !is.null(x), dl$download_error)[[1]]
message = "Operation too slow. Less than 100 bytes/sec transferred the last 30 seconds"
So this is my stupid download limit, of course.
When downloading a lot of files, it does not make much sense to enforce a speed limit on each file separately. The limit should apply to the combined downloads instead, and the default should be much more relaxed, too.
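For reference, the error text is libcurl's low-speed abort, controlled by the `low_speed_limit` / `low_speed_time` options (here 100 bytes/sec over 30 seconds). A minimal sketch of relaxing those options on a plain curl handle; the values are illustrative, not proposed defaults, and this uses the generic curl API rather than whatever pkgdepends does internally:

```r
library(curl)

# Hedged sketch: relax libcurl's per-transfer low-speed abort.
# The values below are illustrative, not pkgdepends' defaults.
h <- new_handle(
  low_speed_limit = 10,  # abort only if under 10 bytes/sec ...
  low_speed_time = 300   # ... for 300 consecutive seconds
)
res <- curl_fetch_memory(
  "https://api.github.com/repos/r-lib/pkgdepends",
  handle = h
)
res$status_code
```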
Seen first at https://github.com/ncov-ic/drat2/runs/1058245772#step:5:128
My guess is that we open too many HTTP connections, and this causes the slowdown. I could reproduce it locally; working on a "smaller" reprex now.
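One way to test the too-many-connections hypothesis outside of pkgdepends is to cap concurrency on a curl multi pool and check whether the low-speed failures go away. A sketch under that assumption; the pool limits and URLs are arbitrary examples:

```r
library(curl)

# Hedged sketch: cap total and per-host concurrent connections.
pool <- new_pool(total_con = 10, host_con = 4)
urls <- c("https://github.com/", "https://api.github.com/")
for (u in urls) {
  multi_add(
    new_handle(url = u),
    done = function(res) message("done: ", res$url, " (", res$status_code, ")"),
    fail = function(err) message("failed: ", err),
    pool = pool
  )
}
multi_run(pool = pool)
```

If small values of `host_con` make the failures disappear, that would point at connection contention rather than genuinely slow mirrors.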