brey / pyPoseidon

Framework for hydrodynamic simulations of geoflows
https://pyposeidon.readthedocs.io/
European Union Public License 1.2

Consider creating compressed netcdf files #25

Open pmav99 opened 3 years ago

pmav99 commented 3 years ago

The output NetCDF files are not compressed. This can easily be checked with:

ncdump -s -h ../path/to/output.nc | grep -i deflate

Creating compressed netcdfs will greatly reduce the file size and is something we should consider.

brey commented 3 years ago

Good tip. I will look into it.

vvoukouvalas commented 3 years ago

Some years ago, I ran some tests with the netCDF compression levels (1-9). Compression level 4 was the best trade-off between speed and compression ratio. The tests were done with MATLAB's SNCTOOLS (a), but I expect the same holds for the other netCDF libraries. Additionally, we can reduce the file size by converting the original double-precision data to integer types after rescaling the original values (e.g. water level from meters to mm), without losing accuracy.
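A minimal sketch of the scaled-integer idea, using NumPy only; the values and the 1 mm scale factor are illustrative, not taken from pyposeidon:

```python
import numpy as np

# Pack double-precision water levels (metres) into int16 millimetres.
# Readers multiply by the scale factor to recover metres.
levels = np.array([0.0, 1.2345, -0.5678, 2.5])  # float64, metres
scale_factor = 0.001  # 1 mm resolution

packed = np.round(levels / scale_factor).astype(np.int16)  # what goes on disk
unpacked = packed * scale_factor                           # what readers see

# The quantization error is bounded by half the scale factor, i.e. 0.5 mm.
assert np.abs(levels - unpacked).max() <= scale_factor / 2
```

int16 halves the storage of float32 (and quarters float64) before compression, and the quantized values also compress better because they have fewer distinct bit patterns.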

a) http://mexcdf.sourceforge.net/

pmav99 commented 3 years ago

> The compression level no.4 was the best in terms of speed and compression.

I agree that as a rule of thumb it is a good choice.
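A minimal sketch of the kind of comparison behind that rule of thumb: compress the same buffer at several zlib levels (netCDF's deflate filter uses zlib underneath) and compare size and time. The data here is synthetic.

```python
import time
import zlib

import numpy as np

# Synthetic float32 payload standing in for model output.
data = np.sin(np.linspace(0, 100, 500_000)).astype("float32").tobytes()

for level in (1, 4, 9):
    start = time.perf_counter()
    compressed = zlib.compress(data, level)
    elapsed = time.perf_counter() - start
    print(f"level {level}: {len(compressed):>9} bytes in {elapsed:.3f} s")
```

Typically the size gain from level 4 to 9 is small while the time cost grows noticeably, which is why 4 is a common default.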

> by converting the original double precision data to integer ones, after converting the original values (e.g. water level in meters to mm) without losing accuracy

True. That being said, depending on the application, even some loss of accuracy can occasionally be acceptable.

pmav99 commented 3 years ago

Implementing this will require passing an argument named encoding to the to_netcdf() calls.

http://xarray.pydata.org/en/stable/generated/xarray.Dataset.to_netcdf.html

> encoding (dict, optional) – Nested dictionary with variable names as keys and dictionaries of variable specific encodings as values, e.g., {"my_variable": {"dtype": "int16", "scale_factor": 0.1, "zlib": True}, ...}

> The h5netcdf engine supports both the NetCDF4-style compression encoding parameters {"zlib": True, "complevel": 9} and the h5py ones {"compression": "gzip", "compression_opts": 9}. This allows using any compression plugin installed in the HDF5 library, e.g. LZF.
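To make the two equivalent spellings concrete (the variable name "elev" and level 4 are illustrative):

```python
# NetCDF4-style compression encoding for a hypothetical variable "elev":
netcdf4_style = {"elev": {"zlib": True, "complevel": 4}}

# The equivalent h5py-style encoding accepted by the h5netcdf engine:
h5py_style = {"elev": {"compression": "gzip", "compression_opts": 4}}

# Either dict can be passed as encoding= to Dataset.to_netcdf()
# when engine="h5netcdf" is used.
```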

Not necessarily related to the issue at hand, but the following links provide some interesting insights with regard to data formats and compression:

https://docs.dask.org/en/latest/best-practices.html#store-data-efficiently
https://docs.dask.org/en/latest/dataframe-best-practices.html#store-data-in-apache-parquet-format

pmav99 commented 3 years ago

If the netcdf files are being created with the xarray.Dataset.to_netcdf() method, then we need to provide a value for the encoding parameter. It should be a dictionary of the form

{"my_variable": {"dtype": "int16", "scale_factor": 0.1, "zlib": True}, ...}

http://xarray.pydata.org/en/stable/generated/xarray.Dataset.to_netcdf.html