jhamman opened this issue 4 years ago
From @ekluzek:
There's some documentation in the User's Guide about how to do this...
There are also some README files in the tools directory about the different tools needed.
Do any of these grids have open ocean that you want to exclude CLM from running over? If there's a small amount of ocean, the process is a little easier if you just run CLM as wetland over ocean areas. If there's a lot of ocean, that will obviously be a waste of CPU time.
Here I assume you DO have ocean, the process simplifies a bit if you don't...
1. Create a SCRIP grid file for the land area.
2. Do the same for the ocean area.
3. Create mapping files between the land and ocean (cime/tools/mapping/gen_mapping_files); there is also a script to create a unity mapping file when there is no ocean (tools/mkmapdata/mkunitymap.ncl).
4. Create a domain file from that mapping file (gen_domain under cime/tools/mapping/gen_domain_files).
5. Create mapping files for mksurfdata_map with mkmapdata.sh (under tools/mkmapdata).
6. Run mksurfdata_map to create your surface dataset.
See the README file in the CTSM checkout under the tools subdirectory. Each of the tools has a README file for it as well.
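The six steps above can be sketched as a shell walkthrough. This is a hedged outline, not a verified recipe: `mygrid` and the SCRIP file names are placeholders, the gen_mapping_files arguments are deliberately omitted (see its README), and the mkmapdata.sh/mksurfdata.pl flags are the ones that appear later in this thread.

```shell
#!/bin/bash
# Sketch of the dataset-generation tool chain described above.
# "mygrid" and the SCRIP file names are placeholders, not real files.
set -u
run=echo   # dry-run: prints each command; set run= (empty) to execute

lnd_scrip=SCRIPgrid_mygrid_lnd.nc   # step 1: SCRIP grid for the land area
ocn_scrip=SCRIPgrid_mygrid_ocn.nc   # step 2: SCRIP grid for the ocean area

# step 3: land<->ocean mapping files; exact arguments vary, see that
# tool's README (use tools/mkmapdata/mkunitymap.ncl if there is no ocean)
$run cime/tools/mapping/gen_mapping_files/gen_cesm_maps.sh  # <args per its README>

# step 4: domain file from the ocean->land mapping file
# (-m/-o/-l per the gen_domain README; the map-file name is illustrative)
$run cime/tools/mapping/gen_domain_files/gen_domain \
    -m map_ocn_to_lnd_aave.nc -o ocn_mygrid -l lnd_mygrid

# step 5: mapping files for mksurfdata_map
$run tools/mkmapdata/mkmapdata.sh -f "$lnd_scrip" --res usrspec --gridtype regional

# step 6: surface dataset
$run tools/mksurfdata_map/mksurfdata.pl -res usrspec -usr_gname mygrid -usr_gdate 200811
```

Run as-is it only echoes the commands, which is a cheap way to review the sequence before committing batch time.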
From me:
A bit more info and one follow-on question. Our domain has about 50% ocean grid cells. We're starting from an existing domain file (e.g. /glade/p/ral/hap/common_data/rasm/inputdata/share/domains/domain.lnd.wr50a_ar9v4.130607.nc).
So my question is: where do I plug in this domain file? I'll check with Tony but I assume we have SCRIP grid files for the land/ocean if we strictly need them but it would be nice to start with this domain file if possible.
From Erik...
Well, in the model there are XML variables LND_DOMAIN_FILE and OCN_DOMAIN_FILE that you set to point to the files, as well as variables for the PATH to those files. Do

```
./xmlquery -p DOMAIN
```

in a case to see all the domain-related variables.
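In practice that means pointing an existing case at the custom domain files with xmlchange. A minimal sketch, using the RASM domain path/file quoted above (run inside a real case directory; the dry-run variable just echoes the commands here):

```shell
#!/bin/bash
# Sketch: point a case at custom domain files via the CIME XML variables.
set -u
run=echo   # set run= (empty) to execute inside a real case directory

domain_path=/glade/p/ral/hap/common_data/rasm/inputdata/share/domains
domain_file=domain.lnd.wr50a_ar9v4.130607.nc

$run ./xmlquery -p DOMAIN   # list all DOMAIN-related variables first
$run ./xmlchange LND_DOMAIN_PATH="$domain_path",LND_DOMAIN_FILE="$domain_file"
# OCN_DOMAIN_PATH / OCN_DOMAIN_FILE are set the same way for the ocean side
```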
I don't know of a tool to convert a domain file into a SCRIP file. But, you should be able to gin up something to do that.
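The "gin up something" part is mostly array bookkeeping. Here is a minimal numpy sketch of the rearrangement, assuming the usual domain-file variable names (xc/yc centers, xv/yv corners, mask) and SCRIP field names; writing the resulting arrays to NetCDF (with netCDF4 or xarray) is left out:

```python
import numpy as np

def domain_to_scrip(xc, yc, xv, yv, mask):
    """Rearrange CESM domain-file arrays into SCRIP grid-file arrays.

    Domain files store centers xc/yc and mask as (nj, ni) and corners
    xv/yv as (nv, nj, ni); SCRIP wants everything flattened to
    grid_size = ni * nj, with corners shaped (grid_size, grid_corners).
    """
    nj, ni = xc.shape
    nv = xv.shape[0]
    return {
        "grid_dims": np.array([ni, nj], dtype=np.int32),
        "grid_center_lon": xc.reshape(-1),
        "grid_center_lat": yc.reshape(-1),
        # transpose so corners come out as (grid_size, grid_corners)
        "grid_corner_lon": xv.reshape(nv, -1).T,
        "grid_corner_lat": yv.reshape(nv, -1).T,
        "grid_imask": mask.reshape(-1).astype(np.int32),
    }
```

Dumping these arrays to a NetCDF file with dimensions grid_size, grid_corners, and grid_rank should then give something mkmapdata.sh can consume, though the result is worth diffing against a known-good SCRIP file.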
Now, the other thing is that you can often find the paths to files by doing "ncdump -h" and looking at the header. Doing that pointed me to the mapping file here, which I grabbed from svn inputdata: /glade/p/cesmdata/cseg/inputdata/cpl/cpl6/map_ar9v4_to_wr50a_aave_da_100920.nc. What I had hoped was that doing an ncdump on that file would show the files that created it (which it usually does), but in this case it doesn't. However, I do see some files in this directory that might be what you're looking for:
```
/glade/p/cesmdata/cseg/mapping/grids
ar9v4_100920.nc  wr50a_ar9v4_130417.nc
```
I think those are the SCRIP grid files that you need to use.
I'm starting to walk through these steps.
```
cheyenne:~/projects/rasm-nna/cesm/components/clm/tools/mkmapdata ((ctsm1.0.dev080))$ \
./mkmapdata.sh -f /glade/p/cesmdata/cseg/mapping/grids/wr50a_ar9v4_130417.nc --res usrspec --gridtype regional
```
This is running now after I made a few changes to the mkmapdata.sh shell script:
```diff
diff --git a/tools/mkmapdata/mkmapdata.sh b/tools/mkmapdata/mkmapdata.sh
index 436b504e..19995ecc 100755
--- a/tools/mkmapdata/mkmapdata.sh
+++ b/tools/mkmapdata/mkmapdata.sh
@@ -342,8 +342,9 @@ case $hostname in
     if [ -z "$REGRID_PROC" ]; then
        REGRID_PROC=36
     fi
+    echo "loading modules now"
     esmfvers=7.1.0r
-    intelvers=17.0.1
+    intelvers=18.0.5
     module load esmf_libs/$esmfvers
     module load intel/$intelvers
     module load ncl
@@ -515,7 +516,7 @@ until ((nfile>${#INGRID[*]})); do
        cmd="$cmd --dst_regional"
     fi
-    cmd="$cmd --src_type ${SRC_TYPE[nfile]} ${SRC_EXTRA_ARGS[nfile]} --dst_type $DST_TYPE $DST_EXTRA_ARGS"
+    cmd="$cmd ${SRC_EXTRA_ARGS[nfile]} $DST_EXTRA_ARGS"
     cmd="$cmd $lrgfil"
     runcmd $cmd
```
I have generated some wr50a datasets for CTSM and am now trying to build and run RASM-NNA. To generate the datasets, I did the following, all of which is subject to change depending on whether it works.
CTSM wr50a input dataset generation.
Check out CTSM or CESM (cannot use the CTSM in RASM because it does not contain CIME); get master
- git clone https://github.com/ESCOMP/ctsm ctsm_master
Need scrip grid file
- wr50a_090615.nc already exists
Run mkmapdata tool
- needs to be run in batch due to memory use
- run on cheyenne
- regional grids should be run on 1 pe
- modify modules as needed in mkmapdata.sh, use intelvers=18.0.5, esmfvers=8.0.0
- want to run the following in batch mode, ./mkmapdata.sh -f $CESMDATAROOT/mapping/grids/wr50a_090614.nc --res wr50a --gridtype regional
- submit a job that runs
- cd /glade/work/tcraig/NNA/ctsm_master/tools/mkmapdata
- qsub go.makemapdata.csh
Run mksurfdata tool
- can run interactively on cheyenne ./mksurfdata.pl -res usrspec -usr_gname wr50a -usr_gdate 200811
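The batch step above submits go.makemapdata.csh, whose contents are not shown in this thread. A hypothetical stand-in, assuming a PBS job script built from the commands quoted above (queue, walltime, and the CESMDATAROOT default are assumptions to check against your account):

```shell
#!/bin/bash
#PBS -N mkmapdata_wr50a
#PBS -q regular
#PBS -l select=1:ncpus=1     # regional grids should be run on 1 PE, per the notes above
#PBS -l walltime=02:00:00
#PBS -A <project>
# Hypothetical equivalent of go.makemapdata.csh; paths follow this thread.
set -u
run=echo   # dry-run: echoes the commands; set run= (empty) when submitting for real
CESMDATAROOT=${CESMDATAROOT:-/glade/p/cesmdata/cseg}   # assumed default

$run cd /glade/work/tcraig/NNA/ctsm_master/tools/mkmapdata
$run ./mkmapdata.sh -f "$CESMDATAROOT/mapping/grids/wr50a_090614.nc" \
    --res wr50a --gridtype regional
```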
The process seems to be working as outlined above. CTSM datasets have been created for the 12km NNA configuration and are starting to be validated.
We'll need to prepare CTSM input datasets for multiple new model domains. What is the procedure for doing this? Is this documented somewhere? I'm assuming this was recently done for the LILAC project (CTSM/WRF coupling).
cc @apcraig, @billsacks, @ekluzek