PredictiveEcology / SpaDES.core

Core functionality for Spatial Discrete Event Simulation (SpaDES)
https://spades-core.predictiveecology.org/
GNU General Public License v3.0

saveSimList() hangs using qs; writing massive files #258

Open · achubaty opened this issue 1 year ago

achubaty commented 1 year ago

Calling saveSimList() with the previous dev-stable versions of SpaDES.core and reproducible produced reasonable file sizes when using fileBackEnd = 2 (now deprecated):

```
rep01$ du -sh simOut*.qs
125M    simOutDataPrep_NRR_Cariboo.qs
1.2G    simOutPreamble_NRR_Cariboo.qs
46M     simOutSpeciesLayers_NRR_Cariboo.qs
```
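For reference, the earlier usage was roughly the following (a sketch only: the simList object and output path are hypothetical stand-ins, and fileBackEnd is the now-deprecated argument as quoted above):

```r
library(SpaDES.core)

## sketch of the previous dev-stable usage described above;
## simOutPreamble and the output path are hypothetical stand-ins
saveSimList(
  simOutPreamble,
  "outputs/rep01/simOutPreamble_NRR_Cariboo.qs",
  fileBackEnd = 2   # deprecated argument referenced in this report
)
```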

Using the current development versions (with the latest modules, which use terra), the R session hangs (I killed it after several hours) and appears to be writing massive files: the preamble was over 100 GB and Biomass_speciesData more than 10 GB. I am calling saveSimList(sim, file, inputs = FALSE, outputs = FALSE, cache = FALSE).
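For concreteness, the hanging call is equivalent to the sketch below; sim is the preamble simList and the path is a hypothetical stand-in, with the growing on-disk size checked from a second session:

```r
## sketch of the call that hangs with the current development versions;
## 'sim' and the path are hypothetical stand-ins
f <- "outputs/rep01/simOutPreamble_NRR_Cariboo.qs"
saveSimList(sim, f, inputs = FALSE, outputs = FALSE, cache = FALSE)

## from another session, the partially written file keeps growing:
file.size(f) / 1024^3   # size in GB; exceeded 100 GB before the session was killed
```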

I'm not sure whether this is related to https://github.com/PredictiveEcology/reproducible/issues/359, since my understanding is that saveSimList() uses some of the Cache() infrastructure to deal with file-backed R objects, etc.

It may be a qs issue: while debugging, I manually changed the filename extension to .rds and used saveRDS(), which wrote the preamble simList to a 2 GB file.
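That workaround amounts to bypassing qs entirely (a sketch; the object and path are hypothetical stand-ins):

```r
## write the same simList with base R serialization instead of qs;
## 'sim' and the path are hypothetical stand-ins
saveRDS(sim, "outputs/rep01/simOutPreamble_NRR_Cariboo.rds")
file.size("outputs/rep01/simOutPreamble_NRR_Cariboo.rds") / 1024^3  # roughly 2 GB
```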

achubaty commented 1 year ago

Update: this only occurs with qs (rds works as expected), but I haven't been able to track it down further yet.
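A minimal way to isolate the serializer, assuming a simList object sim is available (paths are hypothetical), is to write the identical object with both back ends and compare the on-disk sizes:

```r
library(qs)

## serialize the same object with base R and with qs, then compare sizes;
## 'sim' and the file names are hypothetical stand-ins
saveRDS(sim, "preamble_check.rds")
qs::qsave(sim, "preamble_check.qs")

file.size(c("preamble_check.rds", "preamble_check.qs")) / 1024^3  # sizes in GB
```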