NCAR / i-wrf

Integrated End-To-End WRF Containerized Framework
https://i-wrf.org
Apache License 2.0

Run Hurricane Matthew test case in WRF container on Derecho #46

Open jaredalee opened 7 months ago

jaredalee commented 7 months ago

Describe the Task

Using the WRF container on Derecho, simulate the Hurricane Matthew test case from the WRF Online Tutorial (https://www2.mmm.ucar.edu/wrf/OnLineTutorial/CASES/SingleDomain/index.php). Document the steps/commands needed to complete the execution. Run WRF on two nodes for this case to demonstrate the full multi-node capability of the WRF container. Make special note of whether nodes=2:ppn=128 works fine on Derecho, or whether fewer processors per node than 128 are needed due to the relatively small domain/grid size of this test case (if requesting too many processors/tasks makes the domain patches too small, WRF will stop immediately).
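One way to anticipate the too-many-tasks failure mode before submitting is to estimate the patch size WRF's near-square domain decomposition will produce. A minimal sketch, with two loudly labeled assumptions: the grid dimensions below (e_we=91, e_sn=100) are my recollection of the Matthew tutorial's single domain and should be checked against namelist.input, and the ~10-points-per-patch threshold is a commonly cited guideline rather than WRF's exact internal limit:

```shell
#!/bin/bash
# Rough patch-size estimate for WRF's near-square domain decomposition.
# ASSUMPTIONS: e_we=91 / e_sn=100 are recalled tutorial values (verify in
# namelist.input); patches much smaller than ~10x10 points commonly make
# WRF abort at startup. Staggering is ignored, so sizes are approximate.

check_patch() {
  local ntasks=$1 e_we=$2 e_sn=$3
  local i nx=1
  # largest divisor of ntasks not exceeding sqrt(ntasks) -> near-square split
  for (( i=1; i*i<=ntasks; i++ )); do
    (( ntasks % i == 0 )) && nx=$i
  done
  local ny=$(( ntasks / nx ))
  local px=$(( (e_we - 1) / nx ))
  local py=$(( (e_sn - 1) / ny ))
  echo "$ntasks tasks -> ${nx}x${ny} decomposition, patch ~${px}x${py} points"
}

check_patch 256 91 100   # nodes=2:ppn=128 -> patches ~5x6, likely too small
check_patch 64  91 100   # fewer ranks -> patches ~11x12, comfortably sized
```

This suggests the full 2x128 layout would undersize the patches for this domain, while something like 2 nodes at 32 ranks each should be safe.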

Time Estimate

1–2 days

Sub-Issues

Consider breaking the task down into sub-issues.

Relevant Deadlines

Target completion 1 Mar 2024.

Funding Source

7790013

Define the Metadata

Assignee

Labels

Projects and Milestone

Task Checklist

jaredalee commented 4 months ago

@hahnd @rcplane In order for the METplus container (#48) to read in the data properly, use the namelist.input and vars_io.txt files from the attached .tar.gz file to write out pressure-level diagnostics files (wrfout_plev). The run will also create height-level diagnostics files (wrfout_zlev) as a demonstration of capability. As a side benefit, the vars_io.txt file suppresses numerous extraneous variables from the wrfout files to reduce clutter, further demonstrating container capabilities.
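For orientation, pressure- and height-level diagnostics in recent WRF versions are controlled through the &diags namelist record. The fragment below is a sketch of the kind of settings involved, written from my recollection of the WRF Registry variable names; the level values are placeholders, the sign convention for z_levels (AGL vs. MSL) should be checked against the WRF Users' Guide, and the attached matthew_config.tar.gz remains the authoritative configuration (output stream and file naming are set separately in &time_control):

```fortran
&diags
 p_lev_diags      = 1,                          ! enable pressure-level diagnostics
 num_press_levels = 4,
 press_levels     = 92500, 85000, 70000, 50000, ! levels in Pa (placeholder values)
 z_lev_diags      = 1,                          ! enable height-level diagnostics
 num_z_levels     = 3,
 z_levels         = -80, -100, -120,            ! sign convention varies; check the Users' Guide
/
```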

matthew_config.tar.gz

georgemccabe commented 4 months ago

@jaredalee, are the only inputs needed to run WRF included in the tar file you provided (namelist.input and vars_io.txt)? If so, I would suggest that we add those files to the NCAR/i-wrf repository under use_cases/Hurricane_Matthew/WRF since they are small. That way the only input data we would need to provide separately for this use case would be the MADIS observation files.

jaredalee commented 4 months ago

@georgemccabe There are other inputs needed to run WPS (geogrid, ungrib, metgrid), real, and WRF, such as GFS model data for ICs/LBCs. The WRF Online Tutorial site has a link that can be used to download that data. So we could either ask users to download the data from there themselves, or we could also host all the required data ourselves, which could be insurance against MMM revamping the tutorial and updating it with a newer case. There's also a WPS namelist that's required, but that's just a small text file like the WRF namelist. We could either store the namelists in our repo somewhere for download, or include instructions on how to modify the default namelists that will get installed with WPS & WRF. It all depends on how easy it will be for users to move, replace, edit, or symlink to files within or outside the container image, and that's something I don't currently have a good feel for.