We implement a zarr conversion capability in post that will enable:

- creating zarr archives
- deleting netCDF files
This PR led to a major overhaul of netCDF file reading, which also improves it. By default, we no longer chunk the time and vertical dimensions; this was counterproductive for small simulations. For large simulations, chunks will have to be provided explicitly.
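A minimal sketch of this default, assuming hypothetical names (`resolve_chunks` and the dimension names are illustrative, not part of the codebase); only the behavior, no chunking unless the caller asks for it, comes from this PR:

```python
def resolve_chunks(dims, chunks=None):
    """Return the chunk spec to pass to the netCDF reader.

    By default (chunks=None) no dimension is chunked, avoiding the
    overhead that hurt small simulations. For large simulations the
    caller provides explicit chunks, e.g. {"time": 100}.
    """
    if chunks is None:
        return {}  # no chunking at all (the new default)
    # keep only chunks for dimensions that actually exist in the file
    return {d: n for d, n in chunks.items() if d in dims}
```

For example, `resolve_chunks(("time", "s_rho", "eta", "xi"))` returns `{}`, while `resolve_chunks(("time", "eta", "xi"), {"time": 100})` returns `{"time": 100}`.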
Hard parts:

- chunk sizes must be checked before storing zarr archives, since it is important that chunks are not smaller than some minimum size (approx. 4000x3000);
- add mechanisms to search for all available outputs and verify that they have been stored.
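The chunk-size check could look like the following sketch (the function and constant names are assumptions; only the ~4000x3000 threshold comes from this PR):

```python
MIN_CHUNK_SHAPE = (4000, 3000)  # approximate minimum horizontal chunk size

def chunks_large_enough(chunk_shape, dim_sizes):
    """Return True if each horizontal chunk meets the minimum size.

    A chunk smaller than the minimum is acceptable only when the whole
    dimension is itself smaller than the minimum, in which case a single
    chunk covers the full dimension.
    """
    for chunk, minimum, full in zip(chunk_shape, MIN_CHUNK_SHAPE, dim_sizes):
        if chunk < min(minimum, full):
            return False
    return True
```

Such a guard would run just before writing a zarr archive, rejecting chunk layouts that would produce many tiny stores.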
The PR also proposes renaming several variables (breaking changes):

- `open_nc` -> `outputs`; note that you can now pass the option `'all'`, which loads all available diagnostics
- `filename` -> `nc_files`
- `suffix` -> `key`
- `chunk_time` -> `chunks`, a more general parameter to control chunks when reading files
croco vmode output files are excluded from the zarr conversion and deletion.
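For users migrating existing scripts, the renames above can be summarized in a small helper; the mapping is from this PR, but the helper function itself is purely illustrative and not part of the codebase:

```python
# old name -> new name (breaking changes introduced by this PR)
RENAMES = {
    "open_nc": "outputs",
    "filename": "nc_files",
    "suffix": "key",
    "chunk_time": "chunks",
}

def migrate_kwargs(kwargs):
    """Translate pre-PR keyword argument names to their new equivalents."""
    return {RENAMES.get(k, k): v for k, v in kwargs.items()}
```

For example, `migrate_kwargs({"filename": "his.nc", "suffix": "avg"})` returns `{"nc_files": "his.nc", "key": "avg"}`.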