Hi, I ran into this problem when running the OM4_05 test case in ice_ocean_SIS2 using 'mpirun -n 1 ../../build/gnu/ice_ocean_SIS2/repro/MOM6'. I got the following error message:

FATAL: grid_mod/get_grid_version: Can't determine the version of the grid spec: none of "x_T", "geolon_t", or "ocn_mosaic_file" exist in file "INPUT/grid_spec.nc"

MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. You may or may not see output from other processes, depending on exactly when Open MPI kills them.

Could you let me know how to fix this? Thank you.
Most of the experiments contain symbolic links to the input files. Most likely your grid_spec.nc
file does not exist.
You will need to download the data files and then create a .datasets
symbolic link to the files.
More information here: https://github.com/NOAA-GFDL/MOM6-examples/wiki/Getting-started#downloading-input-data
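For example, from the top of the repository (the dataset path below is only a placeholder; use the location for your site from the wiki page above):

cd MOM6-examples/
ln -sf /path/to/gfdl/datasets .datasets    # placeholder path; see the wiki for your site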
Thanks for your comments. I have checked the INPUT directory in OM4_05, and the grid_spec.nc file exists.
Have you also confirmed that the symbolic link is working? And that .datasets
has been set up properly?
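For instance, something like this should show where the link points and whether it resolves:

$ ls -l .datasets        # prints the link target
$ ls .datasets/          # errors out if the target does not exist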
I just created a .datasets link following the instructions in the examples. I don't know if this is proper.
Linking data sets on Gaea:
cd MOM6-examples/
ln -sf /lustre/f2/pdata/gfdl/gfdl_O/datasets .datasets

Linking data sets on GFDL workstations or GFDL PAN:
cd MOM6-examples/
ln -sf /archive/gold/datasets .datasets
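Once that link is in place, the INPUT symlinks inside each experiment should resolve. As a quick sanity check (run from the MOM6-examples directory; ls -L follows the link and fails if it is broken):

$ ls -lL ice_ocean_SIS2/OM4_05/INPUT/grid_spec.nc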
Those instructions look correct to me (for Gaea at least). To verify, you can use ncdump to check that the file is readable:
$ ncdump -h INPUT/grid_spec.nc
You should see ocn_mosaic_file in the output. If not, then there is probably something wrong with the file.
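To pick that variable out of the header directly, a pipe through grep works:

$ ncdump -h INPUT/grid_spec.nc | grep ocn_mosaic_file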
By the way, OM4_05 is configured to run on 420 cores, and it is not set up to run on a single core. I would expect you to get a different error if it got past this point and launched the job on 1 rank.
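So once the input files are sorted out, the launch command would need a matching rank count, something along these lines (the exact count must match the experiment's processor layout):

$ mpirun -n 420 ../../build/gnu/ice_ocean_SIS2/repro/MOM6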
If you are just getting started with MOM6, you might want to seek more general support, either within your group or at the MOM6 community forums.
This appeared to be a file download issue, so I will close.