OceanParcels / Parcels

Main code for Parcels (Probably A Really Computationally Efficient Lagrangian Simulator)
https://www.oceanparcels.org
MIT License

memory error when deferred_load = False in fieldset #749

Closed · ignasivalles closed this issue 4 years ago

ignasivalles commented 4 years ago

Hi! I had a similar problem to #658. Following @delandmeterp's suggestion I set `deferred_load=False`. A short simulation worked with parcels version 2.1.2 but not with version 2.1.4, where the fieldset creation crashes with a MemoryError. I tried setting `field_chunksize` to `'auto'`, to `False`, and to a fixed number, without success.
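For reference, a minimal sketch of the kind of call I am making (the file pattern and the variable/dimension mappings below are placeholders, not my actual data):

```python
from parcels import FieldSet

# Placeholder file pattern and NetCDF variable/dimension names
filenames = "my_currents_*.nc"
variables = {"U": "uo", "V": "vo"}
dimensions = {"lon": "longitude", "lat": "latitude",
              "depth": "depth", "time": "time"}

fieldset = FieldSet.from_netcdf(filenames, variables, dimensions,
                                deferred_load=False,     # fine in 2.1.2, MemoryError in 2.1.4
                                field_chunksize="auto")  # also tried False and a fixed number
```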

So my question is whether there is a better way to solve the calendar issue without sacrificing the `deferred_load` option.

Thank you!

erikvansebille commented 4 years ago

Have you tried using the timestamps keyword argument, as explained in https://nbviewer.jupyter.org/github/OceanParcels/parcels/blob/master/parcels/examples/tutorial_timestamps.ipynb? Does that fix this issue?
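Something along these lines, reusing your `filenames`/`variables`/`dimensions` mappings (the monthly dates below are just the notebook's illustrative example; adapt them to your files):

```python
import numpy as np
from parcels import FieldSet

# One timestamp per file; expand_dims gives each file its own row
timestamps = np.expand_dims(
    np.array([np.datetime64("2001-%.2d-15" % m) for m in range(1, 13)]),
    axis=1)

fieldset = FieldSet.from_netcdf(filenames, variables, dimensions,
                                timestamps=timestamps)
```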

ignasivalles commented 4 years ago

Thanks Erik! It works, but I had to split my netCDF files into single time-step files to make it work.
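Roughly like this (a sketch using xarray; the file names and the time dimension name are placeholders):

```python
import xarray as xr

ds = xr.open_dataset("currents.nc")  # placeholder: one multi-time-step file
for i in range(ds.sizes["time"]):
    # isel with a list keeps time as a length-1 dimension in each output file
    ds.isel(time=[i]).to_netcdf("currents_%03d.nc" % i)
```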

But now another error appeared again (#712): `recovery_kernel = recovery_map[p.state]; KeyError: 2`.

First I added a recovery kernel, as you suggested in #737, only for `ErrorCode.Error`:

```python
from parcels import ErrorCode

def DeleteParticle(particle, fieldset, time):
    particle.delete()

pset.execute(..., recovery={ErrorCode.Error: DeleteParticle})
```

Then I got a `ThroughSurfaceError` ("Through-surface sampling by particle at (-7.472253, -4.896896, 21.116211)"). Note that the depth value is 21.11 m, which is obviously not at the surface.

Then I set `recovery={ErrorCode.Error: DeleteParticle, ErrorCode.ErrorOutOfBounds: DeleteParticle}` and the same recovery-kernel KeyError appears (see the sketch at the end of this comment for the fuller map I would try).

This always happens in the first time step of my simulation.
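For completeness, the fuller recovery map I would try next. This is only a sketch, assuming this parcels version also exposes `ErrorCode.ErrorThroughSurface` for the through-surface case (I have not confirmed which error code particle state 2 corresponds to):

```python
from parcels import ErrorCode

recovery = {ErrorCode.Error: DeleteParticle,
            ErrorCode.ErrorOutOfBounds: DeleteParticle,
            # assumption: ErrorThroughSurface is available in this version
            ErrorCode.ErrorThroughSurface: DeleteParticle}
pset.execute(..., recovery=recovery)  # other execute() arguments elided as above
```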

CKehl commented 4 years ago

We're working on the KeyError bug right now; we'll keep you informed when it's solved.