Closed GypsyBojangles closed 4 years ago
I ran into the same problem today. Has anyone found an answer?
Found a related answer in:
https://github.com/opendatacube/datacube-core/issues/161
I ended up using datacube.storage.storage.write_dataset_to_netcdf to write to NetCDF format.
Would you say that this response is sufficient to close this issue, @omad?
(@omad is away, so I'll chime in) I think leave this issue open, as we should still update the code to make the builtin functions work, even if it saves a non-CF NetCDF file.
Since we implemented our load function, xarray has added a "load from rasterio" feature that standardises the way CRS is attached. (Much simpler than the NetCDF-CF model for defining CRS)
We should update to match this, and go from there. I think the main things to do are to remove the CRS object and replace it with a string, and to drop the units attribute from time, as it is a datetime64 object rather than a number that needs units.
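A minimal sketch of the two changes proposed above. This is not datacube's actual code; it operates on plain attribute dicts standing in for `ds.attrs` and `ds.time.attrs` on a real xarray.Dataset, so it runs stand-alone. The helper name `simplify_attrs` is made up for illustration.

```python
def simplify_attrs(dataset_attrs, time_attrs):
    """Hypothetical helper: return cleaned copies of the dataset-level
    and time-coordinate attribute dicts, per the two proposed changes."""
    dataset_attrs = dict(dataset_attrs)
    time_attrs = dict(time_attrs)
    if "crs" in dataset_attrs:
        # Change 1: replace the CRS object with its string form.
        dataset_attrs["crs"] = str(dataset_attrs["crs"])
    # Change 2: drop units from time; datetime64 values need no units
    # attribute, and xarray writes time encoding itself.
    time_attrs.pop("units", None)
    return dataset_attrs, time_attrs

ds_attrs, t_attrs = simplify_attrs(
    {"crs": "EPSG:3577"},                     # stands in for a CRS object
    {"units": "seconds since 1970-01-01"},    # would clash with xarray's own encoding
)
```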
Same issue here. @andrewdhicks' solution works.
related: #519
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
It's been a while and several versions have been released in the meantime. The valid import is, at the time of writing:
from datacube.drivers.netcdf import write_dataset_to_netcdf
The datacube.load() function returns xarray.Dataset objects containing attributes that can't be saved to NetCDF by xarray's own writer. The objects that can't be serialised are the CRS object and the flags_definition and spectral_definition dicts, plus CF attributes that xarray doesn't expect to find already set, since it wants to write them itself to match the data (e.g. units on time coordinates). The dataset is structured to be easy to work with, so it is also missing the CRS NetCDF data variable required by the CF conventions; that is only added when we write out files using Datacube's NetCDF writer.
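As a hedged workaround sketch for the problem described above: strip the attributes listed as unserialisable before handing the dataset to xarray's writer. The attribute names come from this thread; the dict-based helper below stands in for mutating `ds.attrs` on a real xarray.Dataset so the example runs without datacube or xarray installed.

```python
# Attribute names reported unserialisable in this issue.
UNSERIALISABLE = ("crs", "flags_definition", "spectral_definition")

def strip_unserialisable(attrs):
    """Return a copy of an attrs mapping with the entries that break
    xarray's NetCDF writer removed (illustrative helper, not an API)."""
    return {k: v for k, v in attrs.items() if k not in UNSERIALISABLE}

# On a real dataset the equivalent would be roughly:
#   ds.attrs = strip_unserialisable(ds.attrs)
#   ds.to_netcdf("out.nc")
# Note this produces a plain (non-CF) NetCDF file, as discussed above.
```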