ropensci / nlrx

nlrx NetLogo R
https://docs.ropensci.org/nlrx
GNU General Public License v3.0

Help wanted about memory issues #50

Closed dorazila closed 2 years ago

dorazila commented 3 years ago

I'm trying to perform a Sobol analysis of NetLogo results, using nlrx and future. I tried two different Windows machines (the first with 96 GB of RAM and 20 cores, the second with 32 GB and 16 cores).

I set up the experiment and defined a "future" simulation plan. However, I'm running into memory issues: even with a low number of samples (500), memory is exhausted, although the machine has 96 GB on board.

Can you help me, please?

CODE

library(nlrx)     # nl(), experiment(), simdesign_sobol2007(), run_nl_all(), analyze_nl()
library(future)   # plan(), %<-%
library(parallel) # detectCores()

Sys.setenv(JAVA_HOME = "c:/Program Files/Java/jdk-15.0.2/")
netlogopath <- file.path("C:/Program Files/NetLogo 6.2.0")

# model with a slider for the mask filter

modelpath <- file.path("v5-4-6-6.4_SFERISTERIO.nlogo")
outpath <- file.path("output")

nl <- nl(nlversion = "6.2.0", nlpath = netlogopath, modelpath = modelpath, jvmmem = 4096)

Space use

tickperact <- 4
numact     <- 4
queuetime  <- 2
pausetime  <- 2
durata <- queuetime * 2 + numact * tickperact + pausetime * (numact - 1)  # total run length in ticks ("durata" = duration)

Experiment

nl@experiment <- experiment(expname = "Sferisterio",
                            outpath = outpath,
                            repetition = 1,
                            tickmetrics = "true",
                            idsetup = "setup",
                            idgo = "go",
                            runtime = durata,   # previously: 320
                            evalticks = durata, # earlier trials: seq(1, 26, by = 1); 26; seq(2, 16, by = 2); c(2, 6, 15); seq(1, 15, by = 2); previously 320, by 49
                            metrics = c("count-infected", "initial-people"),
                            variables = list('initial-people' = list(min = 600, max = 750, qfun = "qunif"),
                                             'init-infectors-percentage' = list(min = 10, max = 15, qfun = "qunif"),
                                             'vaccinated-percentage' = list(min = 30, max = 70, qfun = "qunif"),    # lowest figure as of 18 Apr 2021
                                             'vaccine-effectiveness' = list(min = 85, max = 95, qfun = "qunif"),    # Pfizer and Moderna 94%, AstraZeneca 79%
                                             'mask-wearing-percentage' = list(min = 90, max = 100, qfun = "qunif")  # controls the share of people wearing those masks
                            ),
                            constants = list("form-coeff" = 0.5,
                                             "mask-filter" = 0.01,
                                             "infection-chance" = 100,
                                             "asymptomatic-ratio" = "\"DP\"",
                                             "average-delay" = 96,   # previously: 32
                                             "average-recovery-time" = 256,
                                             "incubation-time" = 512,
                                             "movement-at-breaks" = 56,
                                             "free-moving-agents" = 1,
                                             "tickperact" = tickperact,
                                             "numact" = numact,
                                             "queuetime" = queuetime,
                                             "pausetime" = pausetime,
                                             "moving-at-pause" = 25))

nl@simdesign <- simdesign_sobol2007(nl = nl,
                                    samples = 1000,  # samples the variables and metrics (10 used for testing); Sobol needs large samples, > 1000 values
                                    sobolorder = 2,  # returns the first 2 orders of interaction
                                    sobolnboot = 2,  # bootstrap repetitions (normally 100; set to 2 for testing)
                                    sobolconf = 0.95,
                                    nseeds = 2,
                                    precision = 3)

processi <- 2 * nrow(nl@simdesign@siminput)  # total number of runs ("processi" = processes): siminput rows times 2 seeds
processi
detectCores()

nl@simdesign@siminput
plan(list(sequential, multisession))
system.time(results %<-% run_nl_all(nl = nl, split = 14))
setsim(nl, "simoutput") <- results    # attach the results to the nl object
sensitivityIndices <- analyze_nl(nl)  # compute the sensitivity indices

Save output

write.csv(file = "output/RES_SFER_1.csv", results)
write.csv(file = "output/sobol_SFER_1.csv", sensitivityIndices)

dorazila commented 3 years ago

Thanks a lot 😊

From: Thomas Bayley
Sent: Tuesday, 18 May 2021, 16:45
To: ropensci/nlrx
Cc: dorazila; Author
Subject: Re: [ropensci/nlrx] Help wanted about memory issues (#50)

By memory issues I'm assuming you mean that it's using up all your memory and running very slowly?

I'm new to this package as well, but I have had some experience trying to solve my own memory usage issues, so it may be worth dropping jvmmem from 4096 down to 1024, or to 2048 if you definitely need more than 1024.
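A minimal sketch of that change, reusing the netlogopath and modelpath from the code above (the exact value is a guess; something between 1024 and 2048 MB may be enough):

    # Lower the Java heap allocated to each NetLogo instance (value in MB)
    nl <- nl(nlversion = "6.2.0",
             nlpath = netlogopath,
             modelpath = modelpath,
             jvmmem = 1024)  # was 4096; try 2048 if 1024 is not enough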

From my understanding, the multisession futures plan creates separate R sessions in which to run simulations, and each of these sessions is allocated its own amount of RAM, which I imagine has to be more than the amount needed for the Java virtual machine.

A multisession plan will try to create an R session on every available core (given by detectCores()), so if this number multiplied by 4 GB (the current jvmmem) is greater than the amount of available memory, you'll run into memory problems. I.e., if detectCores() gives a number greater than 24 on your 20-core, 96 GB machine, it will still have memory issues even though it has lots of RAM. Hence dropping jvmmem might help things run better.
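Following that reasoning, one rough sketch (with illustrative numbers, not values from the thread) is to pass an explicit worker count to plan() rather than letting multisession grab every core, so that the number of workers times jvmmem stays below the available RAM:

    library(future)
    library(parallel)

    jvmmem_mb <- 1024                    # per-instance JVM heap chosen above (MB)
    ram_mb    <- 96 * 1024               # total RAM of the 96 GB machine (MB)
    workers   <- min(detectCores(),
                     floor(ram_mb / (2 * jvmmem_mb)))  # factor 2 leaves headroom for each R session

    plan(multisession, workers = workers)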

Hope this helps!


Thomas-Bayley commented 3 years ago

I think I was a bit hasty responding before. Using fewer cores helped my simulation at first, but eventually it still filled up the memory again, so I'm not sure what I said before will be helpful.

I tried to delete the previous comment: since it hadn't resolved my issues like I previously thought, I wanted to remove it to avoid causing any confusion or sending you down the wrong path. Sorry :(


bitbacchus commented 2 years ago

I understand this solved the issue? If not, please re-open :-)