abelew opened this issue 9 months ago
I think I replied via email, but the following example is closer to what happens in my documents, and it shows that you only need ~30 plots (at least in my hands) to hit a duplicate tempfile name:
library(parallel)
library(doParallel)
library(iterators)
library(foreach)
iterators <- 10
returns <- c()
## I haven't used parallel in a while; I can't remember if it behaves like make -j, or whether returns or res ends up holding the results...
cl <- parallel::makeCluster(iterators + 1)
registered <- doParallel::registerDoParallel(cl)
res <- foreach(i = 1:(iterators * 10)) %dopar% {
filename <- tempfile()
returns <- c(returns, filename)
}
stopped <- parallel::stopCluster(cl)
table(duplicated(unlist(res)))
Sorry, but I don't quite understand the code since I'm not familiar with these packages. In particular, I don't understand what returns <- c(returns, filename) is trying to do. sapply(res, duplicated) is all FALSE, and I don't know what duplicated(unlist(res)) implies.
Anyway, it would be clearer if you could show a minimal reproducible example that actually uses knitr.
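For what it's worth, here is a minimal sketch of how foreach normally collects results: each iteration's last expression is returned, and .combine = c flattens them into one vector, so no side-effect assignment to a shared returns vector is needed. This may be what the snippet above intended (worker count and iteration count are arbitrary here):

```r
library(parallel)
library(doParallel)
library(foreach)

cl <- parallel::makeCluster(4)
doParallel::registerDoParallel(cl)

## Each iteration returns its tempfile name; .combine = c collects
## the per-iteration results into a single character vector.
filenames <- foreach(i = 1:100, .combine = c) %dopar% {
  tempfile()
}

parallel::stopCluster(cl)

## Count duplicated names across all workers.
table(duplicated(filenames))
```

With results collected this way, duplicated(filenames) directly answers whether any two iterations received the same name.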
Greetings. When knitting larger documents with many images, I sometimes get tempfile errors saying it ran out of files. When I looked more closely, it seemed to me that tempfile() was just not trying very hard, so I hacked together a quick md5-based tempfile generator that in theory takes all the same arguments as base::tempfile(). I was thinking of making it more robust with some actual pseudorandomly generated material, but this has worked fine for all of my largest and most troublesome documents.
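The description above doesn't include the actual code, but a rough sketch of what such an md5-based generator might look like (a hypothetical reconstruction, not the author's implementation; it hashes the process id, the current time, and a random draw using the digest package, while mimicking base::tempfile()'s main arguments):

```r
library(digest)  # provides digest(), including algo = "md5"

## Hypothetical md5-based stand-in for base::tempfile(): the random
## part of the name is an md5 hash of the process id, a high-resolution
## timestamp, and a random draw, which should make collisions across
## parallel workers much less likely than a short random suffix.
md5_tempfile <- function(pattern = "file", tmpdir = tempdir(),
                         fileext = "") {
  seed <- paste(Sys.getpid(),
                format(Sys.time(), "%Y%m%d%H%M%OS6"),
                runif(1), sep = "-")
  hash <- digest::digest(seed, algo = "md5")
  file.path(tmpdir, paste0(pattern, hash, fileext))
}

f <- md5_tempfile(fileext = ".png")
```

Since the hash input includes the worker's own PID, two workers drawing names at the same instant would still produce different hashes.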