OceanParcels / Parcels

Main code for Parcels (Probably A Really Computationally Efficient Lagrangian Simulator)
https://www.oceanparcels.org
MIT License

Error when using multiple density files #571

Closed — mjmuller94 closed this issue 5 years ago

mjmuller94 commented 5 years ago

Hello,

I have calculated the density from the NEMO data and written it to a NetCDF file. I created a field from it and added it to my fieldset. Because I want to know the density at each particle's location, I defined a new particle class for the density. When I use only one density file and set allow_time_extrapolation=True, everything works fine. However, when I set allow_time_extrapolation=False and use multiple files, I get an error when the particleset is constructed.

What is going wrong here?

My code and the complete error is below and the density files are in my scratch: /scratch/Maarten/RHO_data_025/.

If I use velocity files instead of my density files in SampleParticle, it does work. So maybe something is wrong with my density NetCDF files; the header of one of these files is therefore also shown below.

class SampleParticle(JITParticle):         # Define a new particle class
    rho = Variable('rho', initial=fset.U)  # Variable 'rho' initialised by sampling the density
    drho = Variable('drho', initial= 0)
    rho_0 = Variable('rho_0', initial = fset.U )

The complete error is:

---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-13-1b6d3027f34d> in <module>
      1 times = [datetime(2013, 1, 10)]*len(lons)
      2 
----> 3 pset = ParticleSet(fieldset=fset, pclass=SampleParticle, lon=lons, lat=lats, depth = depths,  time = times)
      4 kernels= pset.Kernel(AdvectionRK4_3D_2) + pset.Kernel(periodicBC) + pset.Kernel(SampleP)   #Periodic boundaries (longitude)

~/sources/parcels_2_beta/parcels/particleset.py in __init__(self, fieldset, pclass, lon, lat, depth, time, repeatdt, **kwargs)
     99 
    100             for i in range(size):
--> 101                 self.particles[i] = pclass(lon[i], lat[i], fieldset=fieldset, depth=depth[i], cptr=cptr(i), time=time[i])
    102                 # Set other Variables if provided
    103                 for kwvar in kwargs:

~/sources/parcels_2_beta/parcels/particle.py in __init__(self, *args, **kwargs)
    222             ptype = self.getPType()
    223             self._cptr = np.empty(1, dtype=ptype.dtype)[0]
--> 224         super(JITParticle, self).__init__(*args, **kwargs)
    225 
    226         fieldset = kwargs.get('fieldset')

~/sources/parcels_2_beta/parcels/particle.py in __init__(self, lon, lat, fieldset, depth, time, cptr)
    182         type(self).fileid.initial = -1  # -1 means particle is not written yet
    183         type(self).dt.initial = None
--> 184         super(ScipyParticle, self).__init__()
    185 
    186     def __repr__(self):

~/sources/parcels_2_beta/parcels/particle.py in __init__(self)
    129                                  'Add a "time=" to ParticleSet construction')
    130                     exit(-1)
--> 131                 initial = v.initial[time, depth, lat, lon]
    132             else:
    133                 initial = v.initial

~/sources/parcels_2_beta/parcels/field.py in __getitem__(self, key)
    333 
    334     def __getitem__(self, key):
--> 335         return self.eval(*key)
    336 
    337     def calc_cell_edge_sizes(self):

~/sources/parcels_2_beta/parcels/field.py in eval(self, time, z, y, x, applyConversion)
    739         time -= periods*(self.grid.time[-1]-self.grid.time[0])
    740         if ti < self.grid.tdim-1 and time > self.grid.time[ti]:
--> 741             f0 = self.spatial_interpolation(ti, z, y, x, time)
    742             f1 = self.spatial_interpolation(ti + 1, z, y, x, time)
    743             t0 = self.grid.time[ti]

~/sources/parcels_2_beta/parcels/field.py in spatial_interpolation(self, ti, z, y, x, time)
    686             val = self.interpolator2D(ti, z, y, x)
    687         else:
--> 688             val = self.interpolator3D(ti, z, y, x, time)
    689         if np.isnan(val):
    690             # Detect Out-of-bounds sampling and raise exception

~/sources/parcels_2_beta/parcels/field.py in interpolator3D(self, ti, z, y, x, time)
    645             return (1-zeta) * f0 + zeta * f1
    646         elif self.interp_method is 'linear':
--> 647             data = self.data[ti, zi, :, :]
    648             f0 = (1-xsi)*(1-eta) * data[yi, xi] + \
    649                 xsi*(1-eta) * data[yi, xi+1] + \

~/sources/parcels_2_beta/parcels/field.py in __getitem__(self, key)
   1219     """Class used for throwing error when Field.data is not read in deferred loading mode"""
   1220     def __getitem__(self, key):
-> 1221         raise RuntimeError('Field is in deferred_load mode, so can''t be accessed. Use .computeTimeChunk() method to force loading of  data')
   1222 
   1223 

RuntimeError: Field is in deferred_load mode, so cant be accessed. Use .computeTimeChunk() method to force loading of  data
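The error comes from Parcels' deferred loading: field data is not read from disk up front but chunk by chunk in time, and any access before a chunk is loaded raises. A toy sketch of that pattern (illustrative Python only, not the actual Parcels implementation) reproduces the same failure mode:

```python
import numpy as np

class DeferredArray:
    """Stand-in that raises like Parcels' Field.data in deferred_load mode."""
    def __getitem__(self, key):
        raise RuntimeError("Field is in deferred_load mode, so can't be accessed. "
                           "Use .computeTimeChunk() method to force loading of data")

class ToyField:
    def __init__(self):
        self.data = DeferredArray()   # nothing has been read from disk yet

    def compute_time_chunk(self):
        # In Parcels this would read the relevant NetCDF time slices;
        # here we just fake a (time, depth, lat, lon) array.
        self.data = np.zeros((2, 3, 4, 5))

field = ToyField()
try:
    field.data[0, 0, 0, 0]            # sampling before any chunk is loaded
except RuntimeError as err:
    print("raised:", err)

field.compute_time_chunk()
print(field.data[0, 0, 0, 0])         # data is now available
```

Initialising a particle Variable from a Field (as in SampleParticle above) samples that Field at particle construction time, which is exactly such an access before the density's time chunk has been loaded.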

Code:

import numpy as np
from parcels import FieldSet, ParticleSet, ScipyParticle, JITParticle, ErrorCode, AdvectionRK4_3D, Field, VectorField, Variable
from datetime import timedelta
from datetime import datetime
from glob import glob

datadir = '/data2/imau/oceanparcels/hydrodynamic_data/NEMO-MEDUSA/ORCA025-N006/'
datadirrho = '/scratch/Maarten/RHO_data_025/'

ufiles = sorted(glob(datadir+'means/ORCA025-N06_201301??d05U.nc'))
vfiles = sorted(glob(datadir+'means/ORCA025-N06_201301??d05V.nc'))
wfiles = sorted(glob(datadir+'means/ORCA025-N06_201301??d05W.nc'))
rhofiles =  sorted(glob(datadirrho + 'ORCA025-N06_201301??d05RHO.nc'))

filenames = {'U': {'lon' : datadir + 'domain/coordinates.nc', 'lat': datadir + 'domain/coordinates.nc', 'depth': wfiles[0], 'data': ufiles},
             'V': {'lon' : datadir + 'domain/coordinates.nc', 'lat': datadir + 'domain/coordinates.nc', 'depth': wfiles[0], 'data': vfiles},
             'W': {'lon' : datadir + 'domain/coordinates.nc', 'lat': datadir + 'domain/coordinates.nc', 'depth': wfiles[0], 'data': wfiles}}

variables = {'U': 'uo', 'V': 'vo', 'W': 'wo'}
dimensions = {'lon': 'glamf', 'lat': 'gphif', 'depth': 'depthw', 'time': 'time_counter'}

fset = FieldSet.from_nemo(filenames,variables, dimensions, allow_time_extrapolation=False)
fset.W.set_scaling_factor(-1.)

dimensions_rho = {'lon': 'nav_lon', 'lat': 'nav_lat', 'depth': 'deptht', 'time': 'time_counter'}
rho = Field.from_netcdf(rhofiles, 'rho', dimensions_rho, allow_time_extrapolation=False, grid = fset.U.grid, interp_method = 'linear')

fset.add_field(rho)

lons = []
lats = []
depths = []

for i in range (100):
    for j in range (30):
        lats.append(-40 - j)
        lons.append(i*3.6)
        depths.append(1500)
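As an aside, the nested loops above build a 100 x 30 grid of release positions; an equivalent vectorized construction with NumPy (an alternative sketch, not a required change) would be:

```python
import numpy as np

# Same grid as the loops: lat = -40 - j for j in 0..29, lon = 3.6 * i for i in 0..99,
# iterated with i as the outer loop and j as the inner loop.
lat_grid, lon_grid = np.meshgrid(-40 - np.arange(30), 3.6 * np.arange(100))
lats = lat_grid.ravel().tolist()
lons = lon_grid.ravel().tolist()
depths = [1500] * len(lons)        # all particles released at 1500 m
```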

class SampleParticle(JITParticle):         # Define a new particle class
    rho = Variable('rho', initial=fset.rho)  # Variable 'rho' initialised by sampling the density
    drho = Variable('drho', initial= 0)
    rho_0 = Variable('rho_0', initial = fset.rho )

def SampleP(particle, fieldset, time):  # Custom function that samples fieldset.rho at particle location
    particle.rho = fieldset.rho[time, particle.depth, particle.lat, particle.lon]
    particle.drho = fieldset.rho[time, particle.depth, particle.lat, particle.lon] - particle.rho_0

times = [datetime(2013, 1, 10)]*len(lons)

pset = ParticleSet(fieldset=fset, pclass=SampleParticle, lon=lons, lat=lats, depth = depths,  time = times)

Header:

dimensions:
        x = 1442 ;
        y = 1021 ;
        deptht = 75 ;
        time_counter = UNLIMITED ; // (1 currently)
variables:
        float nav_lon(y, x) ;
                nav_lon:axis = "X" ;
                nav_lon:units = "degrees_east" ;
                nav_lon:standard_name = "longitude" ;
        float nav_lat(y, x) ;
                nav_lat:axis = "Y" ;
                nav_lat:units = "degrees_north" ;
                nav_lat:standard_name = "latitude" ;
        float deptht(deptht) ;
                deptht:axis = "Z" ;
                deptht:standard_name = "depth" ;
                deptht:units = "m" ;
                deptht:positve = "down" ;
        double time_counter(time_counter) ;
                time_counter:standard_name = "time" ;
                time_counter:units = "seconds since 1900-01-01 00:00:00" ;
                time_counter:calendar = "gregorian" ;
        float rho(time_counter, deptht, y, x) ;
                rho:least_significant_digit = 6 ;
                rho:standard_name = "in situ density" ;
                rho:units = "kg/m^3" ;

// global attributes:
                :_NCProperties = "version=1|netcdflibversion=4.6.1|hdf5libversion=1.10.4" ;

Maarten

erikvansebille commented 5 years ago

Dear Maarten,

First of all, it is good practice to use markdown on GitHub and on gitter.im. I have now updated your comment by adding triple backticks (```python) at the beginning and end of your code, which makes it much more readable.

Then, it is also very important to say where the error occurred (i.e. on which line). A bit more of the 'traceback' (i.e. the errors that come before this last error) will help find the bug.

So please add that, and then we can try to identify what's going on.

erikvansebille commented 5 years ago

I understand this has now been fixed by moving the reading of the rho field directly into the dictionaries of the FieldSet.from_netcdf call.
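A hedged sketch of what that fix looks like (the paths below are placeholders standing in for Maarten's file lists above; the exact dictionaries in his final code may differ): by passing 'rho' alongside U, V and W in the dictionaries given to FieldSet.from_nemo, the density field is created by the FieldSet itself and shares the velocity fields' deferred-loading machinery, instead of being added afterwards with add_field.

```python
# Placeholder file lists; in practice these come from glob() as in the code above.
coords = '/path/to/domain/coordinates.nc'
ufiles, vfiles, wfiles = ['U1.nc'], ['V1.nc'], ['W1.nc']
rhofiles = ['RHO1.nc']

filenames = {'U':   {'lon': coords, 'lat': coords, 'depth': wfiles[0], 'data': ufiles},
             'V':   {'lon': coords, 'lat': coords, 'depth': wfiles[0], 'data': vfiles},
             'W':   {'lon': coords, 'lat': coords, 'depth': wfiles[0], 'data': wfiles},
             'rho': {'lon': coords, 'lat': coords, 'depth': wfiles[0], 'data': rhofiles}}
variables = {'U': 'uo', 'V': 'vo', 'W': 'wo', 'rho': 'rho'}
dimensions = {'lon': 'glamf', 'lat': 'gphif', 'depth': 'depthw', 'time': 'time_counter'}

# With real data, the FieldSet is then built in a single call, e.g.:
# fset = FieldSet.from_nemo(filenames, variables, dimensions,
#                           allow_time_extrapolation=False)
```

Note that the density file's native dimensions (nav_lon, nav_lat, deptht in the header above) differ from the mesh coordinates used here, so per-variable dimensions dictionaries may be needed depending on how the grids should be matched.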