brian-rose opened 6 years ago
Right now it only supports rectangular, LLC and ASTE grids.
But it should be pretty easy to extend it to work with CS, which is actually simpler than those other two. All the hard parts of this are done. @raphaeldussin is currently working on refactoring the internals (see #87, #89, #90).
Do you, or someone in your group, have time to devote to this? If so, it can be done quickly.
I might be able to put a bit of time into this, if you can hold my hand :-) Would this be a valuable contribution? I'm not sure how widespread the use of CS is these days.
Weighing this against using mnc output and existing (non-xarray) python scripts for our project. Or [shudder] going back to matlab.
The new internals allow for an arbitrary number of faces, with padding if necessary. I have looked at the cube sphere documentation and I think it might work out of the box. Do you have a test dataset, or is it close enough to one of the test cases (MITgcm/verification)?
My case is close to MITgcm/verification/cpl_aim+ocn
I tested the ocean part (global_ocean.cs32x15) with my latest code and it almost worked. The only issue is that the llc option (the one that allows faces) assumes nx = ny for each face. Once we get all those PRs in, it should take a few lines to get the cube sphere working.
Because the numpy / xarray / netCDF data models do not permit "ragged arrays", it is necessary for all "faces" in an xarray representation of these grids to have the same shape. This is not how MITgcm represents its "tiles" internally. For example, for the LLC grids, there are only five logical tiles, with the following shapes and orientations:
```
tile 0     tile 1     tile 2
(270,90)   (270,90)   (90,90)
######     ######     ######
######     ######     ######
######     ######     ######
######     ######
######     ######
######     ######
######     ######
######     ######
######     ######

tile 3 (90,270)
##############################
##############################
##############################

tile 4 (90,270)
##############################
##############################
##############################
```
In order to fit this into a single xarray dataset, I chose to break up the tiles into their lowest common denominator. These are called faces. Because the smallest tile in the LLC grid is square (the arctic tile, 90x90), this constrains the geometry of the other faces. So there are 13 90x90 "faces" in an xmitgcm llc dataset.
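This decomposition can be sketched in plain numpy. This is an illustration of the idea, not xmitgcm's actual internals: each logical tile is cut into square blocks the size of the smallest (arctic) tile, so all faces can be stacked along a single dimension.

```python
import numpy as np

N = 90  # side length of the smallest (arctic) tile

# The five logical LLC tiles, with the shapes shown in the diagram above.
tiles = [
    np.zeros((3 * N, N)),  # tile 0, (270, 90)
    np.zeros((3 * N, N)),  # tile 1, (270, 90)
    np.zeros((N, N)),      # tile 2, the square arctic tile
    np.zeros((N, 3 * N)),  # tile 3, (90, 270)
    np.zeros((N, 3 * N)),  # tile 4, (90, 270)
]

faces = []
for tile in tiles:
    ny, nx = tile.shape
    # cut each tile into N x N blocks along its long dimension
    for j in range(ny // N):
        for i in range(nx // N):
            faces.append(tile[j * N:(j + 1) * N, i * N:(i + 1) * N])

# one homogeneous (face, y, x) array that xarray can hold
faces = np.stack(faces)
print(faces.shape)  # (13, 90, 90)
```

The two tall tiles and the two wide tiles each yield three faces, plus the arctic tile, giving the 13 faces mentioned above.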
My understanding is that the only difference between the LLC grid and the CS grid is that the CS grid contains a tile for the south pole, upping the number of faces to 14.
> The only issue is that the llc option (that allows the faces) assumes nx = ny for each face
I can't see how to get around this. What is the original shape of the CS tiles?
I think the CS grid is simpler: it's just 6 square tiles, i.e. nx = ny for each tile already. E.g. the horizontal grid for MITgcm/verification/global_ocean.cs32x15 is 6x32x32.
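Because the six tiles are square, a single horizontal level reshapes cleanly to (face, y, x) with no ragged faces. A minimal sketch, using a synthetic array in place of a record read from an .data file (MDS files are big-endian float32):

```python
import numpy as np

nface, ny, nx = 6, 32, 32  # cs32 geometry: six square tiles

# stand-in for one horizontal level read from a cs32 .data file
raw = np.arange(nface * ny * nx, dtype=">f4")

field = raw.reshape(nface, ny, nx)
print(field.shape)  # (6, 32, 32)
```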
Good news! That maps much more cleanly onto the xarray data model. So "assuming nx = ny" for each face should definitely not cause a problem.
I got confused by the number of tiles in the test case (12 tiles of 16x32)... I can make xmitgcm read the data, but it doesn't look right. Is it possible that they write the mds files in a different order, (face, z, y, x) instead of the llc (z, face, y, x)?
The answer might be buried in here: https://github.com/MITgcm/MITgcm/blob/master/pkg/exch2/w2_set_cs6_facets.F
Hi @brian-rose ,
Sorry for the delay! I have been working on this today. The cube sphere output is ordered (z, y, face, x). Although it is not straightforward to make it work with dask (lazy evaluation of data) in the current implementation of xmitgcm, I have found a way to make it work eagerly. We're in the process of refactoring the low-level functions, but I can send a new PR with CS support once we're done.
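The on-disk (z, y, face, x) layout can be rearranged into the llc-style (z, face, y, x) layout with a single axis move. The sizes and names here are illustrative (cs32 with 15 levels), not xmitgcm internals:

```python
import numpy as np

nz, ny, nface, nx = 15, 32, 6, 32

# synthetic stand-in for a cubed-sphere record as laid out on disk:
# faces are interleaved between y and x, giving (z, y, face, x)
on_disk = np.arange(nz * ny * nface * nx).reshape(nz, ny, nface, nx)

# move the face axis in front of y: (z, y, face, x) -> (z, face, y, x)
llc_like = np.moveaxis(on_disk, 2, 1)
print(llc_like.shape)  # (15, 6, 32, 32)
```

Reading the record with the wrong axis order and then reshaping (instead of transposing) would scramble the faces, which is consistent with the "doesn't look right" symptom described earlier in the thread.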
Great! For our purposes we don't need lazy evaluation, since the grids are small. But let me know if I can help with anything.
Does xmitgcm support reading output on the Cubed Sphere? I suspect the answer is no, since I didn't see anything in the docs about this.
We are starting up some new work with a coupled CS grid MITgcm setup, and trying to figure out the best python-based analysis workflow.