a-bouchat / ice-tracker-deformations

This project aims to compute Arctic sea ice deformations from ice tracker data (Sentinel-1 and RCM).

Reducing memory during computations #37

Open mathieuslplante opened 1 year ago

mathieuslplante commented 1 year ago

We are currently not able to produce the dataset with the new data from Alex Komarov, as it takes too much memory and kills the SSH session on dumbo.

We need to improve the code to make it faster. For instance, I believe we could process each time interval individually and write a netCDF for it, then decide whether to merge the netCDFs into a single dataset.
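
A minimal sketch of that per-interval idea, assuming the processing fits an xarray workflow; `load_pairs`, `compute_deformations`, and the file naming are hypothetical placeholders, not the project's actual functions:

```python
# Sketch only: write one netCDF per interval, so a single interval is in
# memory at a time. `load_pairs` and `compute_deformations` are hypothetical.
from pathlib import Path
import xarray as xr

def write_per_interval(intervals, outdir="deformations"):
    Path(outdir).mkdir(exist_ok=True)
    paths = []
    for start, end in intervals:
        ds = compute_deformations(load_pairs(start, end))  # hypothetical steps
        path = Path(outdir) / f"def_{start:%Y%m%d}_{end:%Y%m%d}.nc"
        ds.to_netcdf(path)
        ds.close()  # free this interval before starting the next one
        paths.append(path)
    return paths

# Optional merge afterwards; open_mfdataset opens the files lazily
# (dask-backed), so the merge does not need every interval in memory at once.
def merge(paths, merged="deformations_all.nc"):
    with xr.open_mfdataset(paths, combine="nested", concat_dim="time") as ds:
        ds.to_netcdf(merged)
```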

dringeis commented 1 year ago

I guess it is because we are loading all the data in order to fill the netCDF. We have to keep everything in memory to do that, which is why it gets so large: we need to keep several fields for each timestep. And at some point, poof goes the elephant, it overflows out of its ears.
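
If the bottleneck really is filling the netCDF from fully loaded arrays, one alternative is to append along an unlimited time dimension so each timestep is flushed to disk as soon as it is computed. A rough sketch with netCDF4, where the grid shape, variable name, and `timestep_generator` are made up for illustration:

```python
# Append one timestep at a time via netCDF4's unlimited dimension, instead of
# building the whole dataset in memory first. Shapes, names, and
# `timestep_generator` are illustrative only.
from netCDF4 import Dataset

with Dataset("deformations.nc", "w") as nc:
    nc.createDimension("time", None)   # unlimited: grows as we append
    nc.createDimension("y", 100)
    nc.createDimension("x", 100)
    eps = nc.createVariable("eps_tot", "f4", ("time", "y", "x"))
    for t, field in enumerate(timestep_generator()):  # hypothetical source
        eps[t, :, :] = field   # written to the file right away
        # `field` can be dropped here; only one timestep is ever live
```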

crunch has 128 GB of memory, dumbo 'only' 70 GB. That could help! =)

mathieuslplante commented 1 year ago

I guess we should try to find a way to avoid holding all this data in memory while we compute the deformations...

mathieuslplante commented 10 months ago

I changed the code so that we loop over the data day by day, although we still load all the data to write the files. I get a memory error after running through about a year of data, so I guess there is still some memory accumulating somewhere...
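
One way to track down where that accumulation lives is to diff tracemalloc snapshots as the daily loop runs; `days` and `process_day` below stand in for whatever the actual loop and its body are:

```python
# Diagnostic sketch: diff tracemalloc snapshots during the daily loop to see
# which allocation sites keep growing. `days` and `process_day` are
# stand-ins for the actual loop and processing step.
import tracemalloc

tracemalloc.start()
baseline = tracemalloc.take_snapshot()
for i, day in enumerate(days):
    process_day(day)                        # hypothetical daily step
    if i and i % 30 == 0:                   # check roughly once a month
        snapshot = tracemalloc.take_snapshot()
        for stat in snapshot.compare_to(baseline, "lineno")[:5]:
            print(stat)                     # top growing allocation sites
```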