Closed ChristinaB closed 7 years ago
@ChristinaB Can you verify that this will work on your local computer (i.e. iterations >= 1000)? The NCSA machine has 192 GB of RAM, so it's unlikely that the server ran out of memory...unless lots of people were using the system (which is possible). I'll try running the notebook and let you know if I encounter the same issue.
@ChristinaB Well, I stand corrected. Your notebook holds steady at about 5 GB of memory consumption until the following command:

`LS_prob.calculate_landslide_probability()`

This begins a phase of rapid memory consumption (last I checked, ~50 GB) before ultimately raising a MemoryError. It appears to be some sort of memory leak inside the Landlab code, and I recommend you contact the Landlab development team to work on a fix.
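To pin down where the memory spike happens, you can measure the peak Python-level allocation around a single call with the standard-library `tracemalloc` module. This is a minimal sketch: `big_alloc` is a stand-in workload, and in the notebook you would pass `LS_prob.calculate_landslide_probability` (no arguments) instead.

```python
import tracemalloc

def measure_peak(fn, *args, **kwargs):
    """Run fn and report the peak memory allocated during the call."""
    tracemalloc.start()
    result = fn(*args, **kwargs)
    current, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    print(f"peak allocation during call: {peak / 1e6:.1f} MB")
    return result, peak

# Stand-in for the Landlab call: a deliberately large temporary allocation.
def big_alloc():
    data = [0.0] * 1_000_000   # roughly 8 MB of list storage
    return sum(data)

result, peak = measure_peak(big_alloc)
```

Note that `tracemalloc` only sees allocations made through Python's allocator; memory grabbed directly by C extensions (e.g. NumPy buffers inside Landlab) is better watched with an external tool such as `psutil.Process().memory_info()`.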
FYI, in the future users will be limited to a much smaller amount of memory, probably 4-8 GB.
@Castronova @ddcamiu I think this error is due to a memory limit, and I am wondering whether that limit could be raised?
To recreate this error, go to https://www.hydroshare.org/resource/07a4ed3b9a984a2fa98901dcb6751954/ and open the notebook called NOCA_runGMDpaper_LandlabLandslide_18May17.ipynb. Section 6 sets the number of model iterations for a simulation. Demonstration sizes run quickly and without error for iterations <= 500. We need to run for iterations = 3000, but we get the memory error at iterations = 1000.
Is this something we can solve in our code, or do we need expanded memory on the server to run this many iterations?
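One code-side workaround worth trying, assuming the Monte Carlo iterations are independent: split the 3000-iteration run into smaller batches and combine the results, forcing a garbage collection between batches so each batch's temporaries can be freed. The sketch below uses a hypothetical `run_batch` stand-in for the real Landlab component call (the actual Landlab API may differ), but the batching pattern itself is generic.

```python
import gc
import random

def run_batch(n_iterations, seed):
    # Stand-in workload: pretend each iteration yields a failure indicator.
    # In the notebook this would be one LandslideProbability run with a
    # small iteration count, reading off the failures it produced.
    rng = random.Random(seed)
    return sum(rng.random() < 0.3 for _ in range(n_iterations))

def probability_in_batches(total_iterations, batch_size):
    """Accumulate a failure-probability estimate batch by batch."""
    failures = 0
    done = 0
    seed = 0
    while done < total_iterations:
        n = min(batch_size, total_iterations - done)
        failures += run_batch(n, seed)
        done += n
        seed += 1
        gc.collect()   # release temporaries from the finished batch
    return failures / total_iterations

p = probability_in_batches(3000, 500)   # six batches of <= 500 iterations
print(f"estimated failure probability: {p:.3f}")
```

Since iterations <= 500 are reported to run without error, batches of that size should stay under the limit; the peak memory is then bounded by one batch rather than by the full 3000-iteration run.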
Thanks!!!