Unidata / netcdf-c

Official GitHub repository for netCDF-C libraries and utilities.
BSD 3-Clause "New" or "Revised" License

Current issues renaming dimensions and coordinates #597

Open czender opened 7 years ago

czender commented 7 years ago

Old issues with renaming dimensions and coordinates in netCDF4 files are beginning to affect CMIP6 processing, so here's a reminder. All recent library versions are affected. See http://nco.sf.net/nco.html#bug_nc4_rename for a chronology. Create a netCDF4 file with a coordinate:

netcdf bug_rnm {
dimensions:
        lon = 4 ;
variables:
        float lon(lon) ;
data:
        lon = 0, 90, 180, 270 ;
}
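
For reference, one way to (re)create the file from the CDL above is with ncgen (a sketch, assuming the CDL is saved as ~/bug_rnm.cdl):

     ncgen -k netCDF-4 -b -o ~/bug_rnm.nc ~/bug_rnm.cdl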

Remember to re-create the file after each of the three tests below. Renaming both dimension and coordinate together works. Yay! This surprises me, given the history recounted above:

     ncrename -d lon,longitude -v lon,longitude ~/bug_rnm.nc # works

Renaming dimension then coordinate separately fails:

       ncrename -d lon,longitude ~/bug_rnm.nc
       ncrename -v lon,longitude ~/bug_rnm.nc # broken: "HDF error"

Renaming coordinate then dimension separately fails:

       ncrename -v lon,longitude ~/bug_rnm.nc
       ncrename -d lon,longitude ~/bug_rnm.nc # broken: "nc4_reform_coord_var: Assertion `dim_datasetid > 0' failed."

WardF commented 7 years ago

Thanks @czender. We were flooded with other PRs and I'm trying to manage those along with the other open issues. We want to get the next bugfix release out shortly, so I'm trying to address the issues you have open right now (in addition to working on some of the PRs Dennis has for compression, etc.).

edhartnett commented 7 years ago

Howdy @czender !

I found a test written by Quincey that deals with renaming, nc_test4/tst_rename.c.

I have added to the existing tests and reproduced your error in C code. Here's the test:

 fprintf(stderr,"*** Test Charlie's test for renaming...");
   {
#define CHARLIE_TEST_FILE "nc_rename_coord_dim.nc"
#define DIM1_NAME "lon"
#define VAR1_NAME "lon"
#define DIM2_NAME "longitude"
#define VAR2_NAME "longitude"
#define DIM1_LEN 4
#define NDIM1 1
      int ncid, dimid, varid;
      float data[DIM1_LEN] = {0, 90.0, 180.0, 270.0};

      if (nc_create(CHARLIE_TEST_FILE, NC_NETCDF4, &ncid)) ERR;
      if (nc_def_dim(ncid, DIM1_NAME, DIM1_LEN, &dimid)) ERR;
      if (nc_def_var(ncid, VAR1_NAME, NC_FLOAT, NDIM1, &dimid, &varid)) ERR;
      if (nc_enddef(ncid)) ERR;
      if (nc_put_var_float(ncid, varid, data)) ERR;
      if (nc_close(ncid)) ERR;

      /* Reopen the file to check. */
      if (check_charlies_file(CHARLIE_TEST_FILE, DIM1_NAME, VAR1_NAME)) ERR;

      /* Open the file and rename the dimension. */
      if (nc_open(CHARLIE_TEST_FILE, NC_WRITE, &ncid)) ERR;
      if (nc_redef(ncid)) ERR;
      if (nc_rename_dim(ncid, 0, DIM2_NAME)) ERR;
      if (nc_enddef(ncid)) ERR;
      if (nc_close(ncid)) ERR;

      /* Reopen the file to check. */
      if (check_charlies_file(CHARLIE_TEST_FILE, DIM2_NAME, VAR1_NAME)) ERR;

      /* Open the file and rename the variable. */
      if (nc_open(CHARLIE_TEST_FILE, NC_WRITE, &ncid)) ERR;
      if (nc_redef(ncid)) ERR;
      if (nc_rename_var(ncid, 0, VAR2_NAME)) ERR;
      if (nc_enddef(ncid)) ERR;
      if (nc_close(ncid)) ERR;

      /* Reopen the file to check. */
      if (check_charlies_file(CHARLIE_TEST_FILE, VAR2_NAME, VAR1_NAME)) ERR;

This fails like this:

*** Test Charlie's test for renaming...HDF5-DIAG: Error detected in HDF5 (1.10.1) thread 0:
  #000: H5Gdeprec.c line 459 in H5Gmove(): couldn't move link
    major: Links
    minor: Unable to initialize object
  #001: H5Gdeprec.c line 541 in H5G_move(): unable to move link
    major: Links
    minor: Can't move object
  #002: H5L.c line 2763 in H5L_move(): unable to find link
    major: Symbol table
    minor: Object not found
  #003: H5Gtraverse.c line 867 in H5G_traverse(): internal path traversal failed
    major: Symbol table
    minor: Object not found
  #004: H5Gtraverse.c line 639 in H5G_traverse_real(): traversal operator failed
    major: Symbol table
    minor: Callback failed
  #005: H5L.c line 2623 in H5L_move_cb(): unable to follow symbolic link
    major: Symbol table
    minor: Object not found
  #006: H5Gtraverse.c line 867 in H5G_traverse(): internal path traversal failed
    major: Symbol table
    minor: Object not found
  #007: H5Gtraverse.c line 639 in H5G_traverse_real(): traversal operator failed
    major: Symbol table
    minor: Callback failed
  #008: H5L.c line 2484 in H5L_move_dest_cb(): an object with that name already exists
    major: Symbol table
    minor: Object not found
Sorry! Unexpected result, tst_rename.c, line: 301

I will take a look and see what is going on here...

edhartnett commented 7 years ago

OK, the problem can be seen from the h5dump:

HDF5 "nc_rename_coord_dim.nc" {
GROUP "/" {
   ATTRIBUTE "_NCProperties" {
      DATATYPE  H5T_STRING {
         STRSIZE 67;
         STRPAD H5T_STR_NULLTERM;
         CSET H5T_CSET_ASCII;
         CTYPE H5T_C_S1;
      }
      DATASPACE  SCALAR
      DATA {
      (0): "version=1|netcdflibversion=4.5.1-development|hdf5libversion=1.10.1"
      }
   }
   DATASET "lon" {
      DATATYPE  H5T_IEEE_F32LE
      DATASPACE  SIMPLE { ( 4 ) / ( 4 ) }
      DATA {
      (0): 0, 90, 180, 270
      }
      ATTRIBUTE "DIMENSION_LIST" {
         DATATYPE  H5T_VLEN { H5T_REFERENCE { H5T_STD_REF_OBJECT }}
         DATASPACE  SIMPLE { ( 1 ) / ( 1 ) }
         DATA {
         (0): (DATASET 659 /longitude )
         }
      }
      ATTRIBUTE "_Netcdf4Dimid" {
         DATATYPE  H5T_STD_I32LE
         DATASPACE  SCALAR
         DATA {
         (0): 0
         }
      }
   }
   DATASET "longitude" {
      DATATYPE  H5T_IEEE_F32BE
      DATASPACE  SIMPLE { ( 4 ) / ( 4 ) }
      DATA {
      (0): 0, 0, 0, 0
      }
      ATTRIBUTE "CLASS" {
         DATATYPE  H5T_STRING {
            STRSIZE 16;
            STRPAD H5T_STR_NULLTERM;
            CSET H5T_CSET_ASCII;
            CTYPE H5T_C_S1;
         }
         DATASPACE  SCALAR
         DATA {
         (0): "DIMENSION_SCALE"
         }
      }
      ATTRIBUTE "NAME" {
         DATATYPE  H5T_STRING {
            STRSIZE 64;
            STRPAD H5T_STR_NULLTERM;
            CSET H5T_CSET_ASCII;
            CTYPE H5T_C_S1;
         }
         DATASPACE  SCALAR
         DATA {
         (0): "This is a netCDF dimension but not a netCDF variable.         4"
         }
      }
      ATTRIBUTE "REFERENCE_LIST" {
         DATATYPE  H5T_COMPOUND {
            H5T_REFERENCE { H5T_STD_REF_OBJECT } "dataset";
            H5T_STD_I32LE "dimension";
         }
         DATASPACE  SIMPLE { ( 1 ) / ( 1 ) }
         DATA {
         (0): {
               DATASET 331 /lon ,
               0
            }
         }
      }
   }
}
}

When the code hits this line in NC4_rename_var() it fails:

  /* Change the HDF5 file, if this var has already been created
      there. */
   if (var->created)
   {
      if (H5Gmove(grp->hdf_grpid, var->name, name) < 0)
         BAIL(NC_EHDFERR);
   }

So the code is trying to rename the dataset with H5Gmove, but can't because there is already a dataset with the new name. That dataset was created to support the dimension name change; it's one of those dreaded secret datasets that support HDF5 dimension scales.

I guess the way to proceed is to check, before the H5Gmove, whether a dataset with the desired name already secretly exists, delete it if so, and then make the newly renamed dataset into a dimension scale.
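
A rough sketch of that idea (not an actual patch; grp->hdf_grpid, var->name, name, and BAIL are from the NC4_rename_var() excerpt above, and the final dimension-scale step is only sketched):

   if (var->created)
   {
      /* If a dimscale-only dataset already occupies the target name,
       * delete its link before moving the real dataset into place. */
      if (H5Lexists(grp->hdf_grpid, name, H5P_DEFAULT) > 0)
         if (H5Ldelete(grp->hdf_grpid, name, H5P_DEFAULT) < 0)
            BAIL(NC_EHDFERR);

      /* Now do the rename (H5Lmove is the non-deprecated form of H5Gmove). */
      if (H5Lmove(grp->hdf_grpid, var->name, grp->hdf_grpid, name,
                  H5P_DEFAULT, H5P_DEFAULT) < 0)
         BAIL(NC_EHDFERR);

      /* Finally the renamed dataset would have to be re-marked as the
       * dimension scale for the dimension (e.g. via H5DSset_scale()). */
   }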

I will try to take a stab at that...

czender commented 7 years ago

Thanks, @edhartnett. It sounds like an intricate fix, so I hope you #persist.

cameronsmith1 commented 7 years ago

Thanks, All!

edhartnett commented 7 years ago

This is happening on this branch: https://github.com/NetCDF-World-Domination-Council/netcdf-c/tree/ejh_zender_coordinate_var_rename

So far I have just added the new test that demonstrates the problem.

edhartnett commented 6 years ago

@czender I am working on this now as part of my current branch. I have been sneaking up on this by writing tests for lots of simpler code in nc4var.c. Now I have done the easy stuff and am moving on to the rename bugs.

To help us understand the urgency of this fix, if any, can you elaborate on how it is being used in CMIP6? And what CMIP6 is?

Are we talking world-shatteringly important climate research here? Or what?

And why the heck are you renaming vars anyway? Can't you scientists just make up your minds? What is the use-case that is driving this? Can you help me understand the importance of this feature?

Thanks!

PS I am going to fix it anyway, obviously. You would have to physically restrain me to stop me from fixing it. I am just curious why you need it.

cameronsmith1 commented 6 years ago

CMIP6 is the large international climate model intercomparison for the next IPCC report. The CMIP6 organizers provided data in netcdf files for all of the different modeling groups to use. However, each climate model typically looks for a particular name for each input variable it needs, and those names vary from model to model (e.g., sea_surface_temperature, SST, surfTemp); there are hundreds of them, especially when we have to reprocess our output to match the CMIP6-required variable names. Hence the need to rename the variables.
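
For concreteness, a single rename of the kind described might look like this with ncrename (variable and file names here are purely illustrative):

     ncrename -v sea_surface_temperature,SST in.nc out.nc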

The good news is that I have already worked around this bug (it was too important to wait), but it will be good to get it fixed for the future.

cameronsmith1 commented 6 years ago

I actually use the NCO utilities, so this is a question for @czender .

DennisHeimbigner commented 6 years ago

Ed, in retrospect, I sure wish you guys had not been seduced by dimension scales :-)

edhartnett commented 6 years ago

Mea culpa, mea culpa, mea maxima culpa.

edhartnett commented 6 years ago

@cameronsmith1 thanks for the explanation.

What workaround are you using?

cameronsmith1 commented 6 years ago

I ended up using the nco utilities of @czender in a different way, so I am not sure how to express it in native HDF commands.

czender commented 6 years ago

I receive a few bug reports a year from people trying to rename coordinates in netCDF4 files with ncrename. CMIP does this on an industrial scale. Most people end up using the multistep workaround(s) documented in the NCO manual here.

edhartnett commented 6 years ago

OK I have a fix for this problem, as well as several other rename problems. I am still working on another one I found.

There is a bunch of state code in this which is not helpful. For example there are these values in NC_VAR_INFO_T:

   nc_bool_t is_new_var;        /* True if variable is newly created */
   nc_bool_t was_coord_var;     /* True if variable was a coordinate var, but either the dim or var has been renamed */
   nc_bool_t became_coord_var;  /* True if variable _became_ a coordinate var, because either the dim or var has been renamed */

These are set at various times in nc_rename_var() and nc_rename_dim().

Then in the sync that is called during nc_enddef() these values are consulted and various actions taken.

Problem is, it's not hard to come up with combinations of rename calls which confuse the crap out of the code. There are so many possible ways that users can name and rename vars and dims, with enddefs/redefs thrown in or not, that it gets very, very confusing.

What would work better would be statelessness: whenever a rename call completes, the file on disk and the in-memory metadata must be completely adjusted and in a consistent state. There must be nothing left over from the rename process for enddef to do. All work must be done in nc_rename_var() and nc_rename_dim().

If that were true, then the user could call them in any wild order, and it would not matter to us.

I am not attempting to take these out right now. I am patching the existing code so that it works for the test cases that we've identified, including Charlie's. But I suspect there are many more bugs out there with the existing code that will only be eliminated by removing these state variables and dealing with everything in the rename.

edhartnett commented 6 years ago

OK, these fixes are up as PR #755. It's branch ejh_fill_values on my netcdf clone: git@github.com:NetCDF-World-Domination-Council/netcdf-c.git

@czender if you could grab the branch and give it a try, that would be great. I don't think I've fixed every possible rename problem, but see if your common cases now work OK.

Any problems would be best expressed by adding a test to nc_test4/tst_rename.c. Simply copy an existing test case, then change it to make it fail, and send me the code or post it here.
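
A skeleton along these lines, modeled on the existing tests in that file, would be ideal (the file name and the rename sequence are placeholders; ERR and SUMMARIZE_ERR come from the test harness):

   fprintf(stderr, "*** testing a rename sequence that still fails...");
   {
      int ncid, dimid, varid;

      /* Create a netCDF-4 file with one dim and one coordinate var. */
      if (nc_create("tst_rename_report.nc", NC_NETCDF4, &ncid)) ERR;
      if (nc_def_dim(ncid, "lon", 4, &dimid)) ERR;
      if (nc_def_var(ncid, "lon", NC_FLOAT, 1, &dimid, &varid)) ERR;
      if (nc_close(ncid)) ERR;

      /* Reopen and perform whatever rename sequence triggers the failure. */
      if (nc_open("tst_rename_report.nc", NC_WRITE, &ncid)) ERR;
      if (nc_rename_dim(ncid, 0, "longitude")) ERR;
      if (nc_rename_var(ncid, 0, "longitude")) ERR;
      if (nc_close(ncid)) ERR;
   }
   SUMMARIZE_ERR;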

cameronsmith1 commented 6 years ago

Thanks for working on this. At the web page that @czender mentioned, there is a long list of problems over the years with renaming variables, which supports your conclusion that the current implementation is fragile against the many possible renaming situations. Interestingly, @czender notes that a robust solution is to convert from netcdf4 to netcdf3, then rename and convert back to netcdf4. Does the implementation of rename for netcdf3 offer a useful template?
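
For the record, a sketch of that round-trip workaround with the NCO tools (file names are placeholders):

     ncks -O -3 in.nc tmp.nc                            # convert netCDF4 -> netCDF3 classic
     ncrename -d lon,longitude -v lon,longitude tmp.nc  # rename in the classic file
     ncks -O -4 tmp.nc out.nc                           # convert back to netCDF4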

czender commented 6 years ago

Thanks, @edhartnett. I will try to test this soon and report the results here.

edhartnett commented 6 years ago

@czender you should actually use branch currently up as PR #763.

This PR contains just the rename fixes from the other PR, now closed without merging.

In tst_rename.c there are some commented-out lines which show an unfixed bug:

         /* Open the file and rename the variable. This will remove
          * the dimscale-only dataset "longitude" and rename the
          * existing dataset "lon" to "longitude". Variable
          * "longitude" will become a coordinate var. */
         /* if (nc_open(CHARLIE_TEST_FILE, NC_WRITE, &ncid)) ERR; */
         /* if (nc_redef(ncid)) ERR; */
         /* if (nc_rename_var(ncid, 0, LONGITUDE)) ERR; */
         /* if (nc_enddef(ncid)) ERR; */
         /* if (nc_close(ncid)) ERR; */

Still working on this one. ;-)

czender commented 6 years ago

I got as far as re-building and testing the latest netCDF master four days ago. It produced a total of 7 new and unexpected failures with the NCO regression test, half with ncks and half with ncpdq. All new failures involved netCDF4 files. I have not yet figured out the exact cause. Tracking that down has higher priority than testing the rename fix for now.

edhartnett commented 6 years ago

Well, it might be a good idea for us to use the NCO tests as another set of tests for netCDF...

cameronsmith1 commented 6 years ago

NCO is extremely widely used in the climate community for working with netcdf files, so including them in your test suite would be wonderful.

WardF commented 6 years ago

We already do as part of our testing. Unfortunately there are two sets of tests: 'make check', which will return non-zero when there is a failure, and 'make test', which requires manual intervention to catch failures. We run the former but not the latter.

WardF commented 6 years ago

@czender are you seeing failures in make check or make test? I am not seeing failures with make check. I will check with make test after my poster session.

czender commented 6 years ago

As you suspect, it is with "make test", which is always what I mean by "the NCO regression test". For historical reasons, NCO's "make check" is basically just a check that the NCO C++ code library links and runs a single, simple program.

czender commented 6 years ago

After a little investigation, it turns out that this was a problem with "make test" relying on a file, hdn.nc, that autoconf apparently does not re-create; it is not a netCDF or NCO code issue. Phew. Now I can move on to testing the rename patch...

WardF commented 6 years ago

Excellent! I will hold off, let me know if any regressions pop up. Thank you @czender!

edhartnett commented 6 years ago

@czender I've just put up another PR (#830) which I believe fixes the test case that you gave me for this issue.

Not sure if all possible rename issues are solved. But if you could try branch ejh_rename_is_here from https://github.com/NetCDF-World-Domination-Council/netcdf-c, and let me know if it now passes your rename tests, that would be great.

Of course, if it does not, I will need the next test case where it fails. ;-)

czender commented 6 years ago

Thank you for your efforts. It continues to fail on the test previously reported.

zender@skyglow:~$ ncks --lbr
Linked to netCDF library version 4.6.0.1 compiled Feb  1 2018 06:23:57
zender@skyglow:~$ ncrename -O -d lev,z -d lat,y -d lon,x ~/nco/data/in_grp.nc ~/foo.nc
zender@skyglow:~$ ncks -H --trd -s %d -v one ~/foo.nc
ncks: INFO nco_bld_dmn_ids_trv() reports variable </g5/one_dmn_rec_var> with duplicate dimensions
ncks: ERROR netCDF file with duplicate dimension IDs detected. Please use netCDF version at least 4.3.0. NB: Simultaneously renaming multiple dimensions with ncrename can trigger this bug with netCDF versions up to 4.6.0.1 (current as of 20180201).
ncks: INFO reports group information
/: 11 subgroups, 7 dimensions, 1 record dimensions, 5 attributes, 18 variables
 #3 dimension: 'coord'(2)
 #4 record dimension: 'time'(10)
 #5 dimension: 'vrt_nbr'(2)
 #6 dimension: 'gds_crd'(8)
 #7 dimension: 'y'(2)
 #8 dimension: 'x'(4)
 #9 dimension: 'z'(3)
/g1: 2 subgroups, 0 dimensions, 0 record dimensions, 1 attributes, 8 variables
/g1/g1g1: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 5 variables
/g1/g1:g2: 0 subgroups, 0 dimensions, 0 record dimensions, 2 attributes, 0 variables
/g2: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 3 variables
/g4: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 4 variables
/g5: 0 subgroups, 2 dimensions, 2 record dimensions, 0 attributes, 3 variables
 #7 record dimension: 'time51'(2)
 #8 record dimension: 'time52'(10)
/g6: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 2 variables
/g6/g6g1: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 1 variables
/g7: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 6 variables
/g7/g7g1: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 1 variables
/g8: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 0 variables
/g8/g8g1: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 0 variables
/g8/g8g1/g8g1g1: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 0 variables
/g8/g8g1/g8g1g1/g8g1g1g1: 0 subgroups, 0 dimensions, 0 record dimensions, 2 attributes, 0 variables
/g9: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 0 variables
/g9/g9g1: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 1 variables
/g9/g9g1/g9g1g1: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 0 variables
/g9/g9g1/g9g1g1/g9g1g1g1: 1 subgroups, 0 dimensions, 0 record dimensions, 1 attributes, 0 variables
/g9/g9g1/g9g1g1/g9g1g1g1/g9g1g1g1g1: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 0 variables
/g9/g9g1/g9g1g1/g9g1g1g1/g9g1g1g1g1/g9g1g1g1g1g1: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 0 variables
/g9/g9g1/g9g1g1/g9g1g1g1/g9g1g1g1g1/g9g1g1g1g1g1/g9g1g1g1g1g1g1: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 4 variables
/g10: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 2 variables
/g11: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 17 variables
/g12: 0 subgroups, 0 dimensions, 0 record dimensions, 20 attributes, 0 variables

ncks: INFO reports variable information
Segmentation fault (core dumped)
edhartnett commented 6 years ago

Can you build with --enable-logging, and then call nc_set_log_level(5) before running those commands?
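
For reference, that means rebuilding libnetcdf with logging enabled, roughly like this (paths and install prefix are up to you), and then calling nc_set_log_level(5) in the client code before the rename calls:

     cd netcdf-c && ./configure --enable-logging && make && make install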

I have tried to build NCO here but could not get it to work.

czender commented 6 years ago

NCO branch "log_level" now calls nc_set_log_level() with the --log_level argument. As shown below this produces no additional output. I have reformatted the crucial information so you can see the issue, namely that, after the renaming, the output file contains four dimensions (y,x,time51,time52) that share two dimension IDs (7,8). It is my understanding that dimension IDs are (or should be) unique per file, so NCO considers the file corrupt and aborts with the message you see.

zender@skyglow:~$ ncrename --log_level=5 -O -d lev,z -d lat,y -d lon,x ~/nco/data/in_grp.nc ~/foo.nc
zender@skyglow:~$ ncks --log_level=5 -H --trd -s %d -v one ~/foo.nc
ncks: INFO nco_bld_dmn_ids_trv() reports variable </g5/one_dmn_rec_var> with duplicate dimensions
ncks: ERROR netCDF file with duplicate dimension IDs detected. Please use netCDF version at least 4.3.0. NB: Simultaneously renaming multiple dimensions with ncrename can trigger this bug with netCDF versions up to 4.6.0.1 (current as of 20180201).
ncks: INFO reports group information
/: 11 subgroups, 7 dimensions, 1 record dimensions, 5 attributes, 18 variables
Fixed dimension name, size, ID = coord, 2, 3
Record dimension name, size, ID = time, 10, 4
Fixed dimension name, size, ID = vrt_nbr, 2, 5
Fixed dimension name, size, ID = gds_crd, 8, 6
Fixed dimension name, size, ID = y, 2, 7
Fixed dimension name, size, ID = x, 4, 8
Fixed dimension name, size, ID = z, 3, 9
/g1: 2 subgroups, 0 dimensions, 0 record dimensions, 1 attributes, 8 variables
/g1/g1g1: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 5 variables
/g1/g1:g2: 0 subgroups, 0 dimensions, 0 record dimensions, 2 attributes, 0 variables
/g2: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 3 variables
/g4: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 4 variables
/g5: 0 subgroups, 2 dimensions, 2 record dimensions, 0 attributes, 3 variables
Record dimension name, size, ID = time51, 2, 7
Record dimension name, size, ID = time52, 10, 8
/g6: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 2 variables
/g6/g6g1: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 1 variables
/g7: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 6 variables
/g7/g7g1: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 1 variables
/g8: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 0 variables
/g8/g8g1: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 0 variables
/g8/g8g1/g8g1g1: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 0 variables
/g8/g8g1/g8g1g1/g8g1g1g1: 0 subgroups, 0 dimensions, 0 record dimensions, 2 attributes, 0 variables
/g9: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 0 variables
/g9/g9g1: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 1 variables
/g9/g9g1/g9g1g1: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 0 variables
/g9/g9g1/g9g1g1/g9g1g1g1: 1 subgroups, 0 dimensions, 0 record dimensions, 1 attributes, 0 variables
/g9/g9g1/g9g1g1/g9g1g1g1/g9g1g1g1g1: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 0 variables
/g9/g9g1/g9g1g1/g9g1g1g1/g9g1g1g1g1/g9g1g1g1g1g1: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 0 variables
/g9/g9g1/g9g1g1/g9g1g1g1/g9g1g1g1g1/g9g1g1g1g1g1/g9g1g1g1g1g1g1: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 4 variables
/g10: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 2 variables
/g11: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 17 variables
/g12: 0 subgroups, 0 dimensions, 0 record dimensions, 20 attributes, 0 variables

ncks: INFO reports variable information
Segmentation fault (core dumped)
czender commented 6 years ago

I will help you build NCO on your machine if you provide more info. I would like you and @WardF to be able to use NCO's "make test" to catch new regressions.

WardF commented 6 years ago

@czender Would it be possible to add a unique string to netcdf-related messages in 'make test'? Since our testing is automated, I could add a script to parse the output of 'make test' and report on that. Would keying on 'netCDF' be sufficient?

czender commented 6 years ago

Yes. I have altered the log_level branch of NCO so that it can act as a trigger:

make test > ~/foo 2>&1;cat ~/foo | grep FAILED | grep Unidata

That pipeline will return $?=0, along with a short description of the purpose of the failing test (the test itself can be found by grepping for that string in ~/nco/bm/NCO_rgr.pm), when a test fails and I consider the failure due to a netCDF library issue, and will return $?=1 (and no string) when there are no such failures. Right now the only failure marked this way is the ncrename issue. Though there may be some false negatives, I will refrain from adding the Unidata tag to those right now. Let's start with this, iterate, and see if it proves useful. I will try to be conservative in adding the Unidata tag. Pinging @hmb1 and @pedro-vicente so they are aware of this new convention in the NCO regression tests. I'm not sure what make test does on Windows, but you can probably ignore it because we never got the test suite running on Windows.
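
For automated runs, a small wrapper along these lines (a sketch; the log file name is arbitrary) would turn that convention into a non-zero exit status:

     make test > nco_test.log 2>&1 || true
     if grep FAILED nco_test.log | grep -q Unidata; then
        grep FAILED nco_test.log | grep Unidata   # show the netCDF-related failures
        exit 1
     fi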

pedro-vicente commented 6 years ago

Pinging @hmb1 and @pedro-vicente

OK, officially reporting for duty on one more problem-solving issue. I am now marking netcdf-c as "watching".

WardF commented 6 years ago

Sounds good. I will update the non-Travis CI runs in the morning.

edhartnett commented 6 years ago

@czender I can't solve the rename issues until I reduce this to a C test.

Was there some reason you can't run with nc_set_log_level()?

Can you tell me what operations that NCO command is performing? That is, what is it attempting to do which fails?

edhartnett commented 6 years ago

@czender re-reading your comments I suspect you have built netCDF without --enable-logging in the configure options. Rebuild netCDF with --enable-logging, then see if you can produce log output for what your test is doing when it fails.

edhartnett commented 6 years ago

@czender one more question: what does ncrename do with respect to enddefs?

For example, if I call it to rename a dim and a var, does it do one, call enddef, and do the other? Or does it do both and then call enddef?

czender commented 6 years ago

Perhaps I gave the wrong impression, or I am doing things incorrectly. Let me clarify on my end:

  1. I always build with --enable-logging
  2. I added --log_level to NCO (now on master) so you can set it yourself
  3. I am using branch ejh_rename_is_here
  4. When I call nc_set_log_level(5) and run the two commands (i.e., with ncrename --log_level=5) there is no change in output (should there be?)
  5. As explained above, this command completes but creates a "corrupt" file when it renames three coordinate dimensions: ncrename --log_level=5 -O -d lev,z -d lat,y -d lon,x ~/nco/data/in_grp.nc ~/foo.nc
  6. The output file contains multiple dimensions with the same dimension ID. This causes ncks to segfault as it attempts to print some metadata in the "corrupt" file, because we assume dimension IDs are globally unique (true?)
  7. ncrename never calls nc_redef() or nc_enddef() on netCDF4 files because the documentation says:

     For netCDF-4 files (i.e. files created with NC_NETCDF4 in the cmode, see nc_create), it is not necessary to call nc_redef unless the file was also created with NC_STRICT_NC3. For straight-up netCDF-4 files, nc_redef is called automatically, as needed.

     It's not necessary to call nc_enddef for netCDF-4 files. With netCDF-4 files, nc_enddef is called when needed by the netcdf-4 library. User calls to nc_enddef for netCDF-4 files still flush the metadata to disk.

     Thus ncrename does all renaming of dims, vars, atts, and groups in netCDF4 files without ever calling `nc_enddef()`. `ncrename` does use `nc_redef()` and `nc_enddef()` for all files except `NC_FORMAT_NETCDF4`. Is this an issue?
  8. Hope this helps
edhartnett commented 6 years ago

Thanks for the answers! I will construct a test case in C.

I do expect output from logging.

nc_enddef is important because it causes a sync to disk. At that time, a lot of the dimension scale stuff happens. So it's generally where the problems occur. Enddef is called automatically when you close the file.

So you open the file, issue one or more rename commands, and then close the file, without doing enddef/redef anywhere. That's good to know. I will set up a C test to match the case you outlined above.
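
A minimal C sketch of that call pattern (illustrative only, not the actual test; the file and dimension names are borrowed from your example):

     #include <netcdf.h>

     int main(void)
     {
        int ncid, dimid;

        if (nc_open("in_grp.nc", NC_WRITE, &ncid)) return 1;
        /* No nc_redef(): netCDF-4 enters define mode as needed. */
        if (nc_inq_dimid(ncid, "lev", &dimid) || nc_rename_dim(ncid, dimid, "z")) return 1;
        if (nc_inq_dimid(ncid, "lat", &dimid) || nc_rename_dim(ncid, dimid, "y")) return 1;
        if (nc_inq_dimid(ncid, "lon", &dimid) || nc_rename_dim(ncid, dimid, "x")) return 1;
        /* No nc_enddef(): nc_close() flushes the metadata to disk. */
        return nc_close(ncid) ? 1 : 0;
     }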

edhartnett commented 6 years ago

@czender I have added a new test program, tst_rename2.c, to HPC NetCDF, to explore this issue.

However, I cannot recreate your problem with the three dimensions. Everything I try seems to work.

For example, here's some code where I create 3 dims, with 3 coord vars, and then rename the dims:

      fprintf(stderr,"*** test renaming 3 dims with coord data format %d...",
              formats[format]);
      {
         char filename[NC_MAX_NAME + 1];
         int ncid, dimid[NDIM3], varid[NDIM3];
         int dimid_in;
         int lat_data[DIM1_LEN] = {0, 1, 2, 3};
         int lon_data[DIM1_LEN] = {0, 10, 20, 30};
         int lev_data[DIM1_LEN] = {0, 100, 200, 300};

         if (nc_set_default_format(formats[format], NULL)) ERR;

         sprintf(filename, "%s_data_%d.nc", TEST_NAME, formats[format]);

         /* Create file with three dims. */
         if (nc_create(filename, 0, &ncid)) ERR;
         if (nc_def_dim(ncid, LAT, DIM1_LEN, &dimid[0])) ERR;
         if (nc_def_dim(ncid, LON, DIM1_LEN, &dimid[1])) ERR;
         if (nc_def_dim(ncid, LEV, DIM1_LEN, &dimid[2])) ERR;

         /* Define coordinate data vars. */
         if (nc_def_var(ncid, LAT, NC_INT, NDIM1, &dimid[0], &varid[0])) ERR;
         if (nc_def_var(ncid, LON, NC_INT, NDIM1, &dimid[1], &varid[1])) ERR;
         if (nc_def_var(ncid, LEV, NC_INT, NDIM1, &dimid[2], &varid[2])) ERR;

         if (nc_enddef(ncid)) ERR;

         if (nc_put_var(ncid, 0, lat_data)) ERR;
         if (nc_put_var(ncid, 1, lon_data)) ERR;
         if (nc_put_var(ncid, 2, lev_data)) ERR;

         if (nc_close(ncid)) ERR;
         if (nc_open(filename, NC_WRITE, &ncid)) ERR;
         if (nc_redef(ncid)) ERR;

         /* Rename the dimensions. */
         if (nc_rename_dim(ncid, 0, DIM_X)) ERR;
         if (nc_rename_dim(ncid, 1, DIM_Y)) ERR;
         if (nc_rename_dim(ncid, 2, DIM_Z)) ERR;

         /* Close the file. */
         if (nc_close(ncid)) ERR;

         /* Reopen the file and check. */
         if (nc_open(filename, NC_NOWRITE, &ncid)) ERR;
         if (nc_inq_dimid(ncid, DIM_X, &dimid_in)) ERR;
         if (dimid_in != 0) ERR;
         if (nc_inq_dimid(ncid, DIM_Y, &dimid_in)) ERR;
         if (dimid_in != 1) ERR;
         if (nc_inq_dimid(ncid, DIM_Z, &dimid_in)) ERR;
         if (dimid_in != 2) ERR;
         if (nc_close(ncid)) ERR;
      }
      SUMMARIZE_ERR;

The above code creates a file with lat, lon, lev, with coordinate vars.

Then it reopens the file, and renames the three dimensions to x, y, z.

The output, as expected, looks like this:

netcdf tst_rename_data_4 {
dimensions:
    x = 4 ;
    y = 4 ;
    z = 4 ;
variables:
    int lat(x) ;
    int lon(y) ;
    int lev(z) ;
data:

 lat = 0, 1, 2, 3 ;

 lon = 0, 10, 20, 30 ;

 lev = 0, 100, 200, 300 ;
}

What do you think your test is doing that mine is not?

Are you also renaming the coord variables, in addition to the dimensions?

Also somehow you are apparently getting the optional dimension IDs. Can you send me an ncdump of the file before the attempted rename, and a h5dump before and after the rename?
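
Roughly, the dumps I'm after would be something like this (output file names are arbitrary):

     ncdump -h ~/nco/data/in_grp.nc > before.cdl        # netCDF view before the rename
     h5dump -H ~/nco/data/in_grp.nc > before_h5.txt     # HDF5 header before
     ncrename -O -d lev,z -d lat,y -d lon,x ~/nco/data/in_grp.nc ~/foo.nc
     h5dump -H ~/foo.nc > after_h5.txt                  # HDF5 header after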

czender commented 6 years ago

Let me reiterate that the rename completes but it creates a "corrupt" file with duplicate dimension IDs. How about I help you build or install NCO so you can reproduce the exact problem?

WardF commented 6 years ago

NCO can be built easily for spot debugging via docker, if a local install is a hassle. If you have docker installed you would run:

$ docker run -it --privileged -e RUNC=OFF -e RUNCXX=OFF -e RUNF=OFF -e RUNP=OFF unidata/nctests:serial bash

If you invoke the run_serial.sh script from the command prompt you are dumped into, it will install libnetcdf and then download/build/test NCO. The script doesn't explicitly clean up, so you will be left with an NCO install to play around with, which will also clean up after itself when you exit.

As a side note, I've added the NCOMAKETEST environment variable and am currently building the new docker images that support it. They will be up later today. If you pass -e NCOMAKETEST to the docker run, it will also run make test and parse for the Unidata-related output.

WardF commented 6 years ago

Ah, yes, also pass -v [path to root of local netcdf directory]:/netcdf-c to test against your local repo instead of pulling down from GitHub.
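
Combined, the invocation might look like this (the local path is a placeholder; -e NCOMAKETEST is passed as described above):

     docker run -it --privileged -v /path/to/netcdf-c:/netcdf-c \
         -e RUNC=OFF -e RUNCXX=OFF -e RUNF=OFF -e RUNP=OFF -e NCOMAKETEST \
         unidata/nctests:serial bash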

edhartnett commented 6 years ago

@czender my rename tests complete, and don't leave a corrupt file. As you can see, the test re-opens the file and checks, and also I've done an ncdump of the file and it is fine. So I still need to reproduce this in a C test. I will see if I can follow Ward's instructions to get NCO built.

czender commented 6 years ago

@edhartnett the duplicate dimension IDs occur for other dimensions elsewhere in the group hierarchy. Hence tests on a flat netCDF4 file without additional dimensions would not necessarily show anything abnormal in the renamed file.

edhartnett commented 6 years ago

@czender can you provide me with an ncdump -h for that file?

czender commented 6 years ago

zender@skyglow:~$ ncrename --log_level=5 -O -d lev,z -d lat,y -d lon,x ~/nco/data/in_grp.nc ~/foo.nc
zender@skyglow:~$ ncks --log_level=5 -H --trd -s %f -v one ~/foo.nc
ncks: INFO nco_bld_dmn_ids_trv() reports variable </g5/one_dmn_rec_var> with duplicate dimensions
ncks: ERROR netCDF file with duplicate dimension IDs detected. Please use netCDF version at least 4.3.0. NB: Simultaneously renaming multiple dimensions with ncrename can trigger this bug with netCDF versions up to 4.6.0.1 (current as of 20180201).
ncks: INFO reports group information
/: 11 subgroups, 7 dimensions, 1 record dimensions, 5 attributes, 18 variables
Fixed dimension name, size, ID = coord, 2, 3
Record dimension name, size, ID = time, 10, 4
Fixed dimension name, size, ID = vrt_nbr, 2, 5
Fixed dimension name, size, ID = gds_crd, 8, 6
Fixed dimension name, size, ID = y, 2, 7
Fixed dimension name, size, ID = x, 4, 8
Fixed dimension name, size, ID = z, 3, 9
/g1: 2 subgroups, 0 dimensions, 0 record dimensions, 1 attributes, 8 variables
/g1/g1g1: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 5 variables
/g1/g1:g2: 0 subgroups, 0 dimensions, 0 record dimensions, 2 attributes, 0 variables
/g2: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 3 variables
/g4: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 4 variables
/g5: 0 subgroups, 2 dimensions, 2 record dimensions, 0 attributes, 3 variables
Record dimension name, size, ID = time51, 2, 7
Record dimension name, size, ID = time52, 10, 8
/g6: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 2 variables
/g6/g6g1: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 1 variables
/g7: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 6 variables
/g7/g7g1: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 1 variables
/g8: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 0 variables
/g8/g8g1: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 0 variables
/g8/g8g1/g8g1g1: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 0 variables
/g8/g8g1/g8g1g1/g8g1g1g1: 0 subgroups, 0 dimensions, 0 record dimensions, 2 attributes, 0 variables
/g9: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 0 variables
/g9/g9g1: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 1 variables
/g9/g9g1/g9g1g1: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 0 variables
/g9/g9g1/g9g1g1/g9g1g1g1: 1 subgroups, 0 dimensions, 0 record dimensions, 1 attributes, 0 variables
/g9/g9g1/g9g1g1/g9g1g1g1/g9g1g1g1g1: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 0 variables
/g9/g9g1/g9g1g1/g9g1g1g1/g9g1g1g1g1/g9g1g1g1g1g1: 1 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 0 variables
/g9/g9g1/g9g1g1/g9g1g1g1/g9g1g1g1g1/g9g1g1g1g1g1/g9g1g1g1g1g1g1: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 4 variables
/g10: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 2 variables
/g11: 0 subgroups, 0 dimensions, 0 record dimensions, 0 attributes, 17 variables
/g12: 0 subgroups, 0 dimensions, 0 record dimensions, 20 attributes, 0 variables

ncks: INFO reports variable information
Segmentation fault (core dumped)
zender@skyglow:~$ ncdump -h ~/foo.nc
netcdf foo {
dimensions:
    coord = 2 ;
    time = UNLIMITED ; // (10 currently)
    vrt_nbr = 2 ;
    gds_crd = 8 ;
    y = 2 ;
    x = 4 ;
    z = 3 ;
variables:
    double ppc_dbl(time) ;
        ppc_dbl:long_name = "Precision-Preserving Compression, big numbers" ;
        ppc_dbl:purpose = "test --ppc switches" ;
        ppc_dbl:original_values = "123456789e-10,123456789e-9,123456789e-8,123456789e-7,123456789e-6,123456789e-5,123456789e-4,123456789e-3,123456789e-2,123456789e-1" ;
    float ppc_flt(time) ;
        ppc_flt:long_name = "Precision-Preserving Compression, single precision" ;
        ppc_flt:purpose = "test --ppc switches" ;
        ppc_flt:original_values = "0.0,0.1,0.12,0.123,0.1234,0.12345,0.123456,0.1234567,0.12345678,0.123456789" ;
    double ppc_big(time) ;
    double time(time) ;
    float lat(y) ;
        lat:units = "degrees_north" ;
    float lon(x) ;
        lon:units = "degrees_east" ;
    float lev(z) ;
        lev:units = "hybrid_sigma_pressure" ;
        lev:bounds = "lev_bnd" ;
    float lev_bnd(z, vrt_nbr) ;
        lev_bnd:purpose = "Cell boundaries for lev coordinate" ;
    float area(y) ;
        area:units = "meter2" ;
    float non_coord(coord) ;
        non_coord:purpose = "Test whether netCDF4 supports renaming non-coordinates to coordinates" ;
    float one ;
    float val_one_mss(y) ;
        val_one_mss:long_name = "one regular value, one missing value" ;
        val_one_mss:_FillValue = 1.e+36f ;
    float scl ;
    int unique ;
        unique:purpose = "the only variable of this name in this file, to test smallest possible access requests" ;
    float lat_lon(y, x) ;
    float att_var(time) ;
        att_var:byte_att = 0b, 1b, 2b, 127b, -128b, -127b, -2b, -1b ;
        att_var:char_att = "Sentence one.\nSentence two.\n" ;
        att_var:short_att = 37s ;
        att_var:int_att = 73 ;
        att_var:long_att = 73 ;
        att_var:float_att = 73.f, 72.f, 71.f, 70.01f, 69.001f, 68.01f, 67.01f ;
        att_var:double_att = 73., 72., 71., 70.01, 69.001, 68.01, 67.010001 ;
    int att_var_jsn ;
        att_var_jsn:char\ att\ with\ whitespace = "cf-json.org <http://cf-json.org>" ;
        att_var_jsn:double\ att\ with\ whitespace = 3.14 ;
        att_var_jsn:int\ att\ with\ whitespace = 1 ;
        att_var_jsn:int_array\ att\ with\ whitespace = 1, 2 ;
        string att_var_jsn:string_array\ att\ with\ whitespace = "1", "2" ;
    int att_var_spc_chr ;
        att_var_spc_chr:space\ in\ name = "foo" ;
        att_var_spc_chr:comma_in_name\, = "foo" ;
        att_var_spc_chr:lt_in_name\< = "foo" ;
        att_var_spc_chr:gt_in_name\> = "foo" ;
        att_var_spc_chr:hash_in_name\# = "foo" ;
        att_var_spc_chr:xclaim_in_name\! = "foo" ;
        att_var_spc_chr:dollar_in_name\$ = "foo" ;
        att_var_spc_chr:ampersand_in_name\& = "foo" ;
        att_var_spc_chr:equals_in_name\= = "foo" ;
        att_var_spc_chr:semicolon_in_name\; = "foo" ;
        att_var_spc_chr:colon_in_name\: = "foo" ;
        att_var_spc_chr:lbrace_in_name\{ = "foo" ;
        att_var_spc_chr:rbrace_in_name\} = "foo" ;
        att_var_spc_chr:lparen_in_name\( = "foo" ;
        att_var_spc_chr:rparen_in_name\) = "foo" ;
        att_var_spc_chr:lbracket_in_name\[ = "foo" ;
        att_var_spc_chr:rbracket_in_name\] = "foo" ;
        att_var_spc_chr:plus_in_name+ = "foo" ;
        att_var_spc_chr:hyphen_in_name- = "foo" ;
        att_var_spc_chr:at_in_name@ = "foo" ;

// global attributes:
        :Conventions = "CF-1.0" ;
        :julian_day = 200000.04 ;
        :RCS_Header = "$Header$" ;
        :history = "Mon Feb  5 11:18:04 2018: ncrename --log_level=5 -O -d lev,z -d lat,y -d lon,x /home/zender/nco/data/in_grp.nc /home/zender/foo.nc\nHistory global attribute.\n" ;
        :NCO = "4.7.3-alpha03" ;

group: g1 {
  variables:
    double ppc_dbl(time) ;
        ppc_dbl:long_name = "Precision-Preserving Compression, big numbers" ;
        ppc_dbl:purpose = "test --ppc switches" ;
        ppc_dbl:original_values = "123456789e-10,123456789e-9,123456789e-8,123456789e-7,123456789e-6,123456789e-5,123456789e-4,123456789e-3,123456789e-2,123456789e-1" ;
    float ppc_flt(time) ;
        ppc_flt:long_name = "Precision-Preserving Compression, single precision" ;
        ppc_flt:purpose = "test --ppc switches" ;
        ppc_flt:original_values = "0.0,0.1,0.12,0.123,0.1234,0.12345,0.123456,0.1234567,0.12345678,0.123456789" ;
    double ppc_big(time) ;
    float lon(x) ;
        lon:units = "degrees_east" ;
    float scl ;
    int g1v1 ;
    int g1v2 ;
    int v1 ;

  // group attributes:
        :history = "History group attribute.\n" ;

  group: g1g1 {
    variables:
        double ppc_dbl(time) ;
            ppc_dbl:long_name = "Precision-Preserving Compression, big numbers" ;
            ppc_dbl:purpose = "test --ppc switches" ;
            ppc_dbl:original_values = "123456789e-10,123456789e-9,123456789e-8,123456789e-7,123456789e-6,123456789e-5,123456789e-4,123456789e-3,123456789e-2,123456789e-1" ;
        float ppc_flt(time) ;
            ppc_flt:long_name = "Precision-Preserving Compression, single precision" ;
            ppc_flt:purpose = "test --ppc switches" ;
            ppc_flt:original_values = "0.0,0.1,0.12,0.123,0.1234,0.12345,0.123456,0.1234567,0.12345678,0.123456789" ;
        double ppc_big(time) ;
        float scl ;
        int v1 ;
    } // group g1g1

  group: g1\:g2 {

    // group attributes:
            :purpose = "group name with semi-special character, a colon (makes CDL-parsing hard)" ;
            :csznote = "As of 20131006, ncks skips groups whose names contain special characters. ncdump handles them fine. e.g.,\nncgen -k netCDF-4 -b -o ~/nco/data/in_grp.nc ~/nco/data/in_grp.cdl\nncks --cdl -m -g g1 ~/nco/data/in_grp.nc | m\nncdump -h -g g1 ~/nco/data/in_grp.nc | m\n" ;
    } // group g1\:g2
  } // group g1

group: g2 {
  variables:
    double time(time) ;
    float lon(x) ;
    float scl ;
  } // group g2

group: g4 {
  variables:
    double ppc_dbl(time) ;
        ppc_dbl:long_name = "Precision-Preserving Compression, big numbers" ;
        ppc_dbl:purpose = "test --ppc switches" ;
        ppc_dbl:original_values = "123456789e-10,123456789e-9,123456789e-8,123456789e-7,123456789e-6,123456789e-5,123456789e-4,123456789e-3,123456789e-2,123456789e-1" ;
    float ppc_flt(time) ;
        ppc_flt:long_name = "Precision-Preserving Compression, single precision" ;
        ppc_flt:purpose = "test --ppc switches" ;
        ppc_flt:original_values = "0.0,0.1,0.12,0.123,0.1234,0.12345,0.123456,0.1234567,0.12345678,0.123456789" ;
    double ppc_big(time) ;
    int one_dmn_rec_var(time) ;
        one_dmn_rec_var:long_name = "one dimensional record variable" ;
        one_dmn_rec_var:units = "kelvin" ;
  } // group g4

group: g5 {
  dimensions:
    time51 = UNLIMITED ; // (2 currently)
    time52 = UNLIMITED ; // (10 currently)
  variables:
    int one_dmn_rec_var(time52) ;
        one_dmn_rec_var:long_name = "one dimensional record variable" ;
        one_dmn_rec_var:units = "kelvin" ;
    double time51(time51) ;
    double time52(time52) ;
  } // group g5

group: g6 {
  variables:
    float area(y) ;
    float area1(y) ;

  group: g6g1 {
    variables:
        float area(y) ;
    } // group g6g1
  } // group g6

group: g7 {
  variables:
    double ppc_dbl(time) ;
        ppc_dbl:long_name = "Precision-Preserving Compression, big numbers" ;
        ppc_dbl:purpose = "test --ppc switches" ;
        ppc_dbl:original_values = "123456789e-10,123456789e-9,123456789e-8,123456789e-7,123456789e-6,123456789e-5,123456789e-4,123456789e-3,123456789e-2,123456789e-1" ;
    float ppc_flt(time) ;
        ppc_flt:long_name = "Precision-Preserving Compression, single precision" ;
        ppc_flt:purpose = "test --ppc switches" ;
        ppc_flt:original_values = "0.0,0.1,0.12,0.123,0.1234,0.12345,0.123456,0.1234567,0.12345678,0.123456789" ;
    double ppc_big(time) ;
    float gds_crd(gds_crd) ;
        gds_crd:long_name = "Geodesic coordinate" ;
        gds_crd:units = "degree" ;
        gds_crd:purpose = "enumerated coordinate like those that might define points in a geodesic grid" ;
        gds_crd:coordinates = "lat_gds lon_gds" ;
    double lat_gds(gds_crd) ;
        lat_gds:units = "degree" ;
        lat_gds:long_name = "Latitude" ;
        lat_gds:standard_name = "latitude" ;
        lat_gds:purpose = "1-D latitude coordinate referred to by geodesic grid variables" ;
    double lon_gds(gds_crd) ;
        lon_gds:long_name = "Longitude" ;
        lon_gds:standard_name = "longitude" ;
        lon_gds:units = "degree" ;
        lon_gds:purpose = "1-D longitude coordinate referred to by geodesic grid variables" ;

  group: g7g1 {
    variables:
        float gds_var(gds_crd) ;
            gds_var:units = "meter" ;
            gds_var:coordinates = "lat_gds lon_gds" ;
    } // group g7g1
  } // group g7

group: g8 {

  group: g8g1 {

    group: g8g1g1 {

      group: g8g1g1g1 {

        // group attributes:
                :mtd_grp = "Group metadata from g8g1g1g1, a leaf-group with no variables, to test whether metadata-only leaf groups are copied and/or printed" ;
                :answer = "Twerking" ;
        } // group g8g1g1g1
      } // group g8g1g1
    } // group g8g1
  } // group g8

group: g9 {

  group: g9g1 {
    variables:
        int v6 ;

    group: g9g1g1 {

      group: g9g1g1g1 {

        // group attributes:
                :mtd_grp = "Group metadata from g9g1g1g1, a group with no variables, to test whether group metadata are copied to ancestor groups of extracted variables" ;

        group: g9g1g1g1g1 {

          group: g9g1g1g1g1g1 {

            group: g9g1g1g1g1g1g1 {
              variables:
                double ppc_dbl(time) ;
                    ppc_dbl:long_name = "Precision-Preserving Compression, big numbers" ;
                    ppc_dbl:purpose = "test --ppc switches" ;
                    ppc_dbl:original_values = "123456789e-10,123456789e-9,123456789e-8,123456789e-7,123456789e-6,123456789e-5,123456789e-4,123456789e-3,123456789e-2,123456789e-1" ;
                float ppc_flt(time) ;
                    ppc_flt:long_name = "Precision-Preserving Compression, single precision" ;
                    ppc_flt:purpose = "test --ppc switches" ;
                    ppc_flt:original_values = "0.0,0.1,0.12,0.123,0.1234,0.12345,0.123456,0.1234567,0.12345678,0.123456789" ;
                double ppc_big(time) ;
                int v7 ;
              } // group g9g1g1g1g1g1g1
            } // group g9g1g1g1g1g1
          } // group g9g1g1g1g1
        } // group g9g1g1g1
      } // group g9g1g1
    } // group g9g1
  } // group g9

group: g10 {
  variables:
    float two_dmn_rec_var(time, z) ;
    float three_dmn_rec_var(time, y, x) ;
        three_dmn_rec_var:units = "watt meter-2" ;
  } // group g10

group: g11 {
  variables:
    byte byte_var ;
        byte_var:long_name = "byte-type variable" ;
    char char_var ;
        char_var:long_name = "char-type variable" ;
    char char_var_arr(time) ;
        char_var_arr:long_name = "char-type variable array" ;
    int int_var ;
        int_var:long_name = "int-type variable" ;
    short short_var ;
        short_var:long_name = "short-type variable" ;
    int long_var ;
        long_var:long_name = "long-type variable" ;
        long_var:purpose = "Variable of CDL type=long, which is deprecated for int. Included to test back-compatibility" ;
    double double_var ;
        double_var:long_name = "double-type variable" ;
    float float_var ;
        float_var:long_name = "float-type variable" ;
    int64 int64_var ;
        int64_var:long_name = "int64-type variable" ;
    string string_var ;
        string_var:long_name = "string-type variable" ;
    string string_arr(y) ;
        string_arr:long_name = "string-type array variable" ;
    string string_rec_arr(time) ;
        string_rec_arr:long_name = "string-type record array variable" ;
    ubyte ubyte_var ;
        ubyte_var:long_name = "ubyte-type variable" ;
    uint uint_var ;
        uint_var:long_name = "uint-type variable" ;
        uint_var:_FillValue = 73U ;
        uint_var:purpose = "_FillValue attribute tests whether NcML parser inadvertently creates two _FillValues for unsigned types" ;
    uint uint_arr(y) ;
        uint_arr:long_name = "uint-type array variable" ;
    uint64 uint64_var ;
        uint64_var:long_name = "uint64-type variable" ;
    ushort ushort_var ;
        ushort_var:long_name = "ushort-type variable" ;
  } // group g11

group: g12 {

  // group attributes:
        :space\ in\ name = "foo" ;
        :comma_in_name\, = "foo" ;
        :lt_in_name\< = "foo" ;
        :gt_in_name\> = "foo" ;
        :hash_in_name\# = "foo" ;
        :xclaim_in_name\! = "foo" ;
        :dollar_in_name\$ = "foo" ;
        :ampersand_in_name\& = "foo" ;
        :equals_in_name\= = "foo" ;
        :semicolon_in_name\; = "foo" ;
        :colon_in_name\: = "foo" ;
        :lbrace_in_name\{ = "foo" ;
        :rbrace_in_name\} = "foo" ;
        :lparen_in_name\( = "foo" ;
        :rparen_in_name\) = "foo" ;
        :lbracket_in_name\[ = "foo" ;
        :rbracket_in_name\] = "foo" ;
        :plus_in_name+ = "foo" ;
        :hyphen_in_name- = "foo" ;
        :at_in_name@ = "foo" ;
  } // group g12
}

czender commented 6 years ago

This new bug report demonstrates that some change in libnetcdf 4.6.1 or 4.6.2-devel fixed a (previously unknown) renaming issue for 64bit-offset files. I thank @edhartnett. I hope progress on netCDF4 renaming can be maintained.