IIRC this allows the files to be compressed more aggressively, and the idea was that the dataset was already complete. What is the problem being caused by this?
The files cannot be read by the LVC code. I think this Stack Overflow question is relevant: https://stackoverflow.com/questions/20340977/using-mfdataset-to-combine-netcdf-files-in-python
@kstanislawska could you give a more accurate description?
MFDataset can only aggregate a collection of files if their time dimension is unlimited (i.e. the record dimension). And I suppose it is out of the question to combine them into a single file.
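For reference, a minimal sketch of the constraint, assuming the files are opened with the netCDF4 Python module (the file pattern and the dimension name "time" below are placeholders, not necessarily what the dataset actually uses):

```python
import glob
import netCDF4

files = sorted(glob.glob("output_*.nc"))  # placeholder file pattern

# Check whether each file declares the time dimension as unlimited.
for f in files:
    with netCDF4.Dataset(f) as ds:
        print(f, "time unlimited:", ds.dimensions["time"].isunlimited())

# MFDataset aggregates along the unlimited (record) dimension, so this
# only works if every file has time as an unlimited dimension (and the
# files are NETCDF3 or NETCDF4_CLASSIC format, not plain NETCDF4).
mf = netCDF4.MFDataset(files)
print(mf.variables["time"][:5])
mf.close()
```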
yes, a typical size for a one-year batch is ~350GB
DropDigits does not mark the time dimension as unlimited when writing it to a new file.
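For illustration only (this is not DropDigits' actual code, and the dimension/variable names are made up): with the netCDF4 Python module, writing the output so MFDataset can aggregate it amounts to declaring the time dimension with length None, which makes it the unlimited/record dimension:

```python
import numpy as np
import netCDF4

# Hypothetical output file; names "time", "lat", "lon", "T2" are placeholders.
with netCDF4.Dataset("out.nc", "w", format="NETCDF4_CLASSIC") as out:
    # None instead of a fixed length -> unlimited (record) dimension,
    # which is what MFDataset needs for aggregation.
    out.createDimension("time", None)
    out.createDimension("lat", 10)
    out.createDimension("lon", 10)

    t = out.createVariable("time", "f8", ("time",))
    v = out.createVariable("T2", "f4", ("time", "lat", "lon"), zlib=True)

    t[:] = np.arange(24)
    v[:] = np.random.rand(24, 10, 10).astype("f4")
```

For files that have already been written with a fixed-size time dimension, NCO's `ncks --mk_rec_dmn time in.nc out.nc` can convert it to a record dimension after the fact (again assuming the dimension is actually named "time"), so the existing batches would not have to be regenerated.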