prl900 opened this issue 4 years ago
The function `write_dataset_to_netcdf`, which saves xarray datasets to netCDF4, seems to be broken: calling it on a Dataset returned by `dc.load` fails with an HDF error.
Is there an alternative way of storing data in netCDF4 format?
This is the code to reproduce the error:
```python
from datacube import Datacube
from datacube.drivers.netcdf import write_dataset_to_netcdf

dc = Datacube(app='Sentinel2')

query = {
    'lat': (-34.90, -34.96),
    'lon': (138.57, 138.63),
    'output_crs': 'EPSG:3577',
    'resolution': (-20, 20),
    'measurements': ["nbar_red", "nbar_green", "nbar_blue"],
    'time': ('2019-01-01', '2019-03-01')
}

adelaide = dc.load(product='s2a_ard_granule', group_by='solar_day', **query)
write_dataset_to_netcdf(adelaide, 'adelaide2019.nc')
```
Which returns this error on my Sandbox:
```
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-3-88dbb3a653d0> in <module>
     14
     15 adelaide = dc.load(product='s2a_ard_granule', group_by='solar_day', **query)
---> 16 write_dataset_to_netcdf(adelaide, 'adelaide2019.nc')

/usr/local/lib/python3.6/dist-packages/datacube/drivers/netcdf/_write.py in write_dataset_to_netcdf(dataset, filename, global_attributes, variable_params, netcdfparams)
     92                                       variable_params,
     93                                       global_attributes,
---> 94                                       netcdfparams)
     95
     96     for name, variable in dataset.data_vars.items():

/usr/local/lib/python3.6/dist-packages/datacube/drivers/netcdf/_write.py in create_netcdf_storage_unit(filename, crs, coordinates, variables, variable_params, global_attributes, netcdfparams)
     39
     40     for name, coord in coordinates.items():
---> 41         netcdf_writer.create_coordinate(nco, name, coord.values, coord.units)
     42
     43     netcdf_writer.create_grid_mapping_variable(nco, crs)

/usr/local/lib/python3.6/dist-packages/datacube/drivers/netcdf/writer.py in create_coordinate(nco, name, labels, units)
     93     nco.createDimension(name, labels.size)
     94     var = nco.createVariable(name, labels.dtype, name)
---> 95     var[:] = labels
     96
     97     var.units = units

/usr/local/lib/python3.6/dist-packages/datacube/drivers/netcdf/_safestrings.py in __setitem__(self, key, value)
     40
     41     def __setitem__(self, key, value):
---> 42         self._wrapped.__setitem__(key, value)
     43
     44     def setncattr(self, name, value):

netCDF4/_netCDF4.pyx in netCDF4._netCDF4.Variable.__setitem__()
netCDF4/_netCDF4.pyx in netCDF4._netCDF4.Variable._put()
netCDF4/_netCDF4.pyx in netCDF4._netCDF4._ensure_nc_success()

RuntimeError: NetCDF: HDF error
```
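Since the failure surfaces inside the netCDF4/HDF5 C libraries rather than in datacube code, a useful first diagnostic is to check which library versions the netCDF4 Python package was built against (the `__netcdf4libversion__` and `__hdf5libversion__` attributes are part of the netCDF4-python module; the helper function name here is just for illustration):

```python
def netcdf_build_info():
    """Return netCDF4-python, libnetcdf and libhdf5 versions, or None if netCDF4 is absent."""
    try:
        import netCDF4
    except ImportError:
        return None
    return {
        "netCDF4-python": netCDF4.__version__,
        "libnetcdf": netCDF4.__netcdf4libversion__,
        "libhdf5": netCDF4.__hdf5libversion__,
    }

info = netcdf_build_info()
print(info)
```

Comparing these versions against a working environment helps confirm whether the error is a library-compatibility issue rather than a datacube bug.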
Can confirm, downgrading the netCDF4 library seems to fix the problem:

```
pip install --user netCDF4==1.5.2
```
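As for an alternative way of writing the data: the object returned by `dc.load` is a plain `xarray.Dataset`, so it can usually be written with xarray's own `to_netcdf` method (in the example above, `adelaide.to_netcdf('adelaide2019.nc')`). Note this bypasses the datacube-specific CRS/grid-mapping metadata that `write_dataset_to_netcdf` adds. A minimal sketch with a synthetic stand-in dataset:

```python
import numpy as np
import xarray as xr

# Synthetic stand-in for the Dataset that dc.load() would return
ds = xr.Dataset(
    {"nbar_red": (("time", "y", "x"), np.random.rand(2, 3, 3))},
    coords={
        "time": np.array(["2019-01-01", "2019-02-01"], dtype="datetime64[ns]"),
        "y": np.arange(3),
        "x": np.arange(3),
    },
)

# xarray picks an available backend (netCDF4, h5netcdf or scipy) automatically;
# passing engine="scipy" writes netCDF3 and avoids the HDF5 layer entirely
ds.to_netcdf("example.nc")
```

If the HDF error persists with the default engine, forcing `engine="scipy"` (netCDF3) or `engine="h5netcdf"` can help isolate whether the problem is in the netCDF4/HDF5 binding.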