tpq / peakRAM

[EXPERIMENTAL] An R package to monitor the peak RAM used by R expressions and functions

peakRAM with parallel processing #1

Open talegari opened 7 years ago

talegari commented 7 years ago

@tpq Thanks for a handy package! I had been using a rough utility of my own for the same purpose: https://gist.githubusercontent.com/talegari/ad06da7795b8771e2e152f304ca00f6f/raw

Do you have any ideas on how to compute the peak RAM when multiple cores are used, i.e. when a sock/fork cluster is instantiated?
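For example (an illustrative sketch using the base parallel package; the object sizes are arbitrary, not my real workload), how would one measure the total peak across the master and the workers for something like this?

library(parallel)

cl <- makeCluster(4)                 # sock (PSOCK) cluster; type = "FORK" also works on Unix
res <- parLapply(cl, 1:8, function(i) {
  x <- rnorm(1e7)                    # ~80 MB allocated inside each worker process
  mean(x)
})
stopCluster(cl)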

tpq commented 7 years ago

@talegari Thank you for your interest in peakRAM! Admittedly, anything to do with parallelizing R really pushes the limits of my knowledge. However, I'm happy to give this problem a think-over. Do you happen to have a simple piece of reproducible code I could try out?

I wonder whether something like the following sketch could work. I assume the garbage collector will detect RAM use regardless of how many R processes the work is distributed across? Maybe not, though...

library(parallel)

runJobs <- function() {                # renamed so it doesn't shadow parallel::makeCluster
  cl <- makeCluster(2)                 # make cluster
  res <- parLapply(cl, 1:4, sqrt)      # deliver jobs across multiple cores
  stopCluster(cl)                      # close cluster
  res
}

peakRAM::peakRAM(runJobs())
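One quick way to test that assumption (a sketch; the sizes are arbitrary) might be to measure the same total allocation serially and across a cluster, then compare the reported peaks:

library(parallel)

serial_run  <- peakRAM::peakRAM(lapply(1:4, function(i) sum(rnorm(1e7))))
cl <- makeCluster(2)
cluster_run <- peakRAM::peakRAM(parLapply(cl, 1:4, function(i) sum(rnorm(1e7))))
stopCluster(cl)

# if gc() only sees the master process, cluster_run should report a much
# smaller Peak_RAM_Used_MiB even though the total work is identical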

plantton commented 3 years ago

Since this issue is still open:

I noticed that my R code actually uses almost 15 GB of RAM according to htop, while the peak RAM usage recorded by peakRAM is merely 300 MB. The code block uses several functions from BiocParallel. I'm still checking whether the discrepancy is caused by the parallel methods, or because some allocations simply cannot be monitored by peakRAM.
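As a rough cross-check against htop (a sketch, Linux/macOS only; it shells out to pgrep/ps and uses nothing from peakRAM), one can sum the resident set size of the master R process and its direct child workers:

rss_tree_mb <- function() {
  # direct children only; grandchild processes would be missed
  kids <- suppressWarnings(system(
    sprintf("pgrep -P %d", Sys.getpid()), intern = TRUE))
  pids <- c(Sys.getpid(), as.integer(kids))
  rss_kb <- as.numeric(system(
    sprintf("ps -o rss= -p %s", paste(pids, collapse = ",")),
    intern = TRUE))
  sum(rss_kb) / 1024                 # ps reports RSS in KB; return MB
}

Polling rss_tree_mb() in a loop (or from a second R session) while the BiocParallel job runs, and keeping the maximum, should approximate the true peak that htop sees.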