OceanParcels / Parcels

Main code for Parcels (Probably A Really Computationally Efficient Lagrangian Simulator)
https://www.oceanparcels.org
MIT License

Adding an additional 3d variable (temperature) from NEMO #766

Closed erikbehrens closed 4 years ago

erikbehrens commented 4 years ago

Hey, I define my fieldset as follows:

data_path = '/home/behrense/_niwa02764n/data/'
ufiles = sorted(glob(data_path+'1*U.nc'))
vfiles = sorted(glob(data_path+'1*V.nc'))
tfiles = sorted(glob(data_path+'1*T.nc'))

mesh_mask = data_path + '1_mesh_mask.nc'

filenames = {'U': {'lon': mesh_mask, 'lat': mesh_mask, 'depth': tfiles[0], 'data': ufiles},
             'V': {'lon': mesh_mask, 'lat': mesh_mask, 'depth': tfiles[0], 'data': vfiles},
             'ice_u': {'lon': mesh_mask, 'lat': mesh_mask, 'data': tfiles},
             'ice_v': {'lon': mesh_mask, 'lat': mesh_mask, 'data': tfiles},
             'ice': {'lon': mesh_mask, 'lat': mesh_mask, 'data': tfiles},
             'temp': {'lon': mesh_mask, 'lat': mesh_mask, 'depth': tfiles[0], 'data': tfiles}}

variables = {'U': 'vozocrtx',
             'V': 'vomecrty',
             'ice_u': 'sea_ice_x_velocity',
             'ice_v': 'sea_ice_y_velocity',
             'ice': 'soicecov',
             'temp': 'votemper'}

dimensions = {'U': {'lon': 'glamf', 'lat': 'gphif', 'depth': 'deptht', 'time': 'time_counter'},
              'V': {'lon': 'glamf', 'lat': 'gphif', 'depth': 'deptht', 'time': 'time_counter'},
              'ice_u': {'lon': 'glamf', 'lat': 'gphif', 'time': 'time_counter'},
              'ice_v': {'lon': 'glamf', 'lat': 'gphif', 'time': 'time_counter'},
              'ice': {'lon': 'glamf', 'lat': 'gphif', 'time': 'time_counter'},
              'temp': {'lon': 'glamf', 'lat': 'gphif', 'depth': 'deptht', 'time': 'time_counter'}}

fieldset = FieldSet.from_nemo(filenames, variables, dimensions)

and run

for year in range(2002,2003):

    pset = ParticleSet.from_list(fieldset, pclass=JITParticle, lon=lon_vector, lat=lat_vector,
                                 time=time_vector, depth=depth_vector)

    pfile = ParticleFile("nemo_particles_"+str(year)+'a.nc', pset, outputdt=timedelta(days=5))

    kernels = pset.Kernel(AdvectionRK4)
    pset.execute(kernels, runtime=timedelta(days=90), dt=timedelta(hours=12), output_file=pfile,
                 recovery={ErrorCode.ErrorOutOfBounds: DeleteParticle})
    pfile.export()

but that results in the following error:

INFO: Compiled JITParticleAdvectionRK4 ==> /tmp/parcels-5765/c937e7fdabb431ee8ec3a27a97dc8b78_0.so
INFO: Unable to locate chunking hints from dask, thus estimating the max. chunk size heuristically. Please consider defining the 'chunk-size' for 'array' in your local dask configuration file (see http://oceanparcels.org/faq.html#field_chunking_config and https://docs.dask.org).
Traceback (most recent call last):
  File "ROAM-NEB004_sea_ice_velocities_vertical_velo_3y_test.py", line 211, in <module>
    pset.execute(kernels, runtime=timedelta(days=90), dt=timedelta(hours=12),output_file=pfile,recovery={ErrorCode.ErrorOutOfBounds: DeleteParticle})
  File "/home/behrense/.conda/envs/py3_parcels/lib/python3.6/site-packages/parcels/particleset.py", line 527, in execute
    self.kernel.execute(self, endtime=time, dt=dt, recovery=recovery, output_file=output_file)
  File "/home/behrense/.conda/envs/py3_parcels/lib/python3.6/site-packages/parcels/kernel.py", line 341, in execute
    self.execute_jit(pset, endtime, dt)
  File "/home/behrense/.conda/envs/py3_parcels/lib/python3.6/site-packages/parcels/kernel.py", line 233, in execute_jit
    f.chunk_data()
  File "/home/behrense/.conda/envs/py3_parcels/lib/python3.6/site-packages/parcels/field.py", line 937, in chunk_data
    self.data_chunks[block_id] = None
IndexError: list assignment index out of range
Exception ignored in: <bound method ParticleFile.__del__ of <parcels.particlefile.ParticleFile object at 0x2aaaf9244198>>
Traceback (most recent call last):
  File "/home/behrense/.conda/envs/py3_parcels/lib/python3.6/site-packages/parcels/particlefile.py", line 195, in __del__
  File "/home/behrense/.conda/envs/py3_parcels/lib/python3.6/site-packages/parcels/particlefile.py", line 200, in close
  File "/home/behrense/.conda/envs/py3_parcels/lib/python3.6/site-packages/parcels/particlefile.py", line 359, in export
  File "/home/behrense/.conda/envs/py3_parcels/lib/python3.6/site-packages/numpy/lib/npyio.py", line 428, in load
NameError: name 'open' is not defined

That only happens if I include the 3d temperature variable.

netcdf 1_ROAM15-NEB004_5d_20020101_20021231_grid_T {
dimensions:
        y = 579 ;
        x = 704 ;
        nvertex = 4 ;
        deptht = 75 ;
        axis_nbounds = 2 ;
        time_counter = UNLIMITED ; // (73 currently)
variables:
        float votemper(time_counter, deptht, y, x) ;
                votemper:standard_name = "sea_water_potential_temperature" ;
                votemper:long_name = "temperature" ;
                votemper:units = "degC" ;
                votemper:online_operation = "instant" ;
                votemper:interval_operation = "1 d" ;
                votemper:interval_write = "5 d" ;
                votemper:cell_methods = "time: point (interval: 1 d)" ;
                votemper:cell_measures = "area: area" ;
                votemper:_FillValue = 1.e+20f ;
                votemper:missing_value = 1.e+20f ;
                votemper:coordinates = "time_instant deptht nav_lon nav_lat" ;
.....
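As an aside, the dimension names in such an `ncdump -h` header are what matters here, since Parcels deduces the depth/time dimensions from them. A small stdlib sketch that extracts them (the header string below is abridged from the dump above):

```python
import re

# Abridged ncdump -h header of the T-grid file shown above.
header = """
netcdf 1_ROAM15-NEB004_5d_20020101_20021231_grid_T {
dimensions:
        y = 579 ;
        x = 704 ;
        nvertex = 4 ;
        deptht = 75 ;
        axis_nbounds = 2 ;
        time_counter = UNLIMITED ; // (73 currently)
variables:
"""

# Dimension lines look like "name = size ;" or "name = UNLIMITED ; // ...".
dims = re.findall(r"^\s*(\w+)\s*=\s*(?:\d+|UNLIMITED)\s*;", header, re.M)
print(dims)  # note the vertical dimension is named 'deptht', not 'depth'
```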

I assume the chunking of the temperature variable differs from that of U and V for unknown reasons. Any insights? Best regards, Erik

PS: Any clues why the NameError: name 'open' is not defined appears? PPS: I am using Parcels 2.1.4.

erikvansebille commented 4 years ago

Hi @erikbehrens. Thanks for your question. @CKehl has been revising and updating the field_chunking quite a bit the last few days, and there's a new working branch at #764 (to be merged into master soon). Could you check if your code works with that branch?

erikvansebille commented 4 years ago

Dear @erikbehrens, have you checked whether #764 fixes your Issue? It is now in master, so you should just be able to pull that branch

PS: The NameError: name 'open' is not defined is an annoying Python issue: Python can no longer delete/close the ParticleFile at the end of a simulation if the simulation is prematurely terminated. A way around this is to create the ParticleFile object inside the pset.execute() call.
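The workaround can be illustrated with a generic stand-in (`Resource` and `run` below are hypothetical, not Parcels API): constructing the object inline and scoping its lifetime to the call means `close()` runs deterministically, rather than during interpreter shutdown when builtins such as `open` may already have been torn down.

```python
import contextlib

class Resource:
    """Hypothetical stand-in for ParticleFile: an object owning a handle."""
    def __init__(self):
        self.closed = False
    def close(self):
        self.closed = True

def run(output_file):
    """Hypothetical stand-in for pset.execute(..., output_file=...)."""
    return output_file

# Create the object inline and scope its lifetime to the call:
with contextlib.closing(Resource()) as pfile:
    run(pfile)
# close() has already run here, not at interpreter teardown.
```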

CKehl commented 4 years ago

Hi! Two comments.

Point 1) I saw the error: somehow the auto-deduction of the NetCDF dimensions failed. That left the chunking setup without the information it needed, so the chunk setup was skipped, which in the end led to the AccessViolationError. That is generally fixed, but looking at the ncdump -h output you provided (thank you for that!) I see that in the next round of keyword-search extensions I need to add deptht to the parsing list. Proposed solution: (a) pull the new master and (b) run your script as follows:

data_path = '/home/behrense/_niwa02764n/data/'
ufiles = sorted(glob(data_path+'1*U.nc'))
vfiles = sorted(glob(data_path+'1*V.nc'))
tfiles = sorted(glob(data_path+'1*T.nc'))

mesh_mask = data_path + '1_mesh_mask.nc'

filenames = {'U': {'lon': mesh_mask, 'lat': mesh_mask,'depth': tfiles[0], 'data': ufiles},
'V': {'lon': mesh_mask, 'lat': mesh_mask,'depth': tfiles[0], 'data': vfiles},
'ice_u': {'lon': mesh_mask, 'lat': mesh_mask, 'data': tfiles},
'ice_v': {'lon': mesh_mask, 'lat': mesh_mask, 'data': tfiles},
'ice': {'lon': mesh_mask, 'lat': mesh_mask, 'data': tfiles},
'temp': {'lon': mesh_mask, 'lat': mesh_mask,'depth': tfiles[0], 'data': tfiles}}

variables = {'U': 'vozocrtx',
'V': 'vomecrty' ,
'ice_u': 'sea_ice_x_velocity',
'ice_v': 'sea_ice_y_velocity',
'ice': 'soicecov',
'temp': 'votemper'}

dimensions = {'U': {'lon': 'glamf', 'lat': 'gphif', 'depth': 'deptht', 'time': 'time_counter'},
'V': {'lon': 'glamf', 'lat': 'gphif', 'depth': 'deptht', 'time': 'time_counter'},
'ice_u': {'lon': 'glamf', 'lat': 'gphif', 'time': 'time_counter'},
'ice_v': {'lon': 'glamf', 'lat': 'gphif', 'time': 'time_counter'},
'ice': {'lon': 'glamf', 'lat': 'gphif', 'time': 'time_counter'},
'temp': {'lon': 'glamf', 'lat': 'gphif', 'depth': 'deptht', 'time': 'time_counter'}}
fieldset = FieldSet.from_nemo(filenames, variables, dimensions, field_chunksize={'x': 64, 'y': 64, 'deptht': 25, 'time_counter': 1})

pset = ParticleSet.from_list(fieldset, pclass=JITParticle,lon=lon_vector,lat=lat_vector,time=time_vector,depth=depth_vector)

pfile = ParticleFile("nemo_particles_"+str(year)+'a.nc', pset, outputdt=timedelta(days=5))

kernels = pset.Kernel(AdvectionRK4)
pset.execute(kernels, runtime=timedelta(days=90), dt=timedelta(hours=12),output_file=pfile,recovery={ErrorCode.ErrorOutOfBounds: DeleteParticle})
pfile.export()
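Separately, the INFO line in your log about missing chunking hints points at dask's local configuration file. A minimal, illustrative fragment (the path and the chunk size are examples, not prescriptions) would look like:

```yaml
# e.g. ~/.config/dask/dask.yaml
array:
  chunk-size: 128MiB
```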

Point 2) On the 'open' issue, you can try to modify the lines

pfile = ParticleFile("nemo_particles_"+str(year)+'a.nc', pset, outputdt=timedelta(days=5))

kernels = pset.Kernel(AdvectionRK4)
pset.execute(kernels, runtime=timedelta(days=90), dt=timedelta(hours=12),output_file=pfile,recovery={ErrorCode.ErrorOutOfBounds: DeleteParticle})
pfile.export()

to

kernels = pset.Kernel(AdvectionRK4)
with ParticleFile("nemo_particles_"+str(year)+'a.nc', pset, outputdt=timedelta(days=5)) as pfile:
    pset.execute(kernels, runtime=timedelta(days=90), dt=timedelta(hours=12),output_file=pfile,recovery={ErrorCode.ErrorOutOfBounds: DeleteParticle})
    pfile.export()

The with statement should act as a guard to automatically close the file no matter what happens inside the statement. Alternatively, you can try

pfile = ParticleFile("nemo_particles_"+str(year)+'a.nc', pset, outputdt=timedelta(days=5))
kernels = pset.Kernel(AdvectionRK4)
try:
    pset.execute(kernels, runtime=timedelta(days=90), dt=timedelta(hours=12),output_file=pfile,recovery={ErrorCode.ErrorOutOfBounds: DeleteParticle})
    pfile.export()
except:
    pfile.close()
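As a general Python pattern (illustrated here with a plain file rather than Parcels), a try/finally closes the handle on both the success and the failure path, so the cleanup never depends on `__del__` running at interpreter shutdown:

```python
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "out.txt")
f = open(path, "w")
try:
    f.write("particles")  # stand-in for pset.execute(...) and pfile.export()
finally:
    f.close()             # runs whether or not an exception was raised

assert f.closed           # the handle is closed deterministically
```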

Hope it helps!

erikvansebille commented 4 years ago

Closing this Issue for now, as it is likely that https://github.com/OceanParcels/parcels/pull/764 has fixed this Issue