pacific-hake / pacifichakemse

A Management Strategy Evaluation for Pacific Hake

exceeding vector memory with 1000 iteration OM-only runs #5

Closed kristinmarshall-NOAA closed 3 years ago

kristinmarshall-NOAA commented 3 years ago

Just flagging that plotting the bias adjustment OM-only runs with 1000 iterations gave me some vector memory errors.

```r
overwrite_rds <- FALSE
ps <- create_plot_objects(scenarios = c("biasadjust"),
                          om_only = c(TRUE, FALSE, FALSE, FALSE),
                          main_results_dir = "results",
                          overwrite_rds = overwrite_rds,
                          short_term_yrs = 2018:2027,
                          long_term_yrs = 2027)
```

```
Error: vector memory exhausted (limit reached?)
Error during wrapup: vector memory exhausted (limit reached?)
```

This might be macOS-specific. I was able to get around it by increasing the maximum size of an R vector as described here.
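For the record, the usual macOS workaround is to raise R's vector heap limit by setting the `R_MAX_VSIZE` environment variable in `~/.Renviron` (the value below is just an example; pick something appropriate for the machine's RAM):

```
# ~/.Renviron -- raise the maximum vector heap size on macOS
R_MAX_VSIZE=100Gb
```

R only reads `.Renviron` at startup, so restart the R session after editing it.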

I think I remember that Nis's code did not save all of the generated output, probably for this reason, but I'd need to go back and check exactly what was saved and what wasn't. I also wonder if saving/writing all the output in this new version of the code could be contributing to slower run times.
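If trimming the saved output turns out to be the fix, a minimal sketch of the idea might look like the following (the object and field names here are hypothetical, not the actual pacifichakemse structures):

```r
# Instead of saving the full OM output for every iteration...
# saveRDS(om_output, file = "results/biasadjust/om_output.rds")

# ...keep only the summary quantities the plotting code needs.
# `om_output`, `ssb`, `catch`, and `depletion` are placeholder names.
om_summary <- list(
  ssb       = om_output$ssb,        # spawning biomass by year/iteration
  catch     = om_output$catch,      # catch by year/iteration
  depletion = om_output$depletion   # relative spawning biomass
)
saveRDS(om_summary, file = "results/biasadjust/om_summary.rds")
```

That keeps the RDS files small enough to load 1000 iterations at once, at the cost of having to re-run the OM if something that wasn't saved is needed later.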

I don't know if anything needs to be done urgently on this, but at least there will be a record for when I forget what I did to make the plotting code run.

cgrandin commented 3 years ago

If you've set the value in the .Renviron file, you shouldn't see the issue again until you update R, get a new machine, etc.