@zacharyburnettNOAA Please let the team know the location of the full ATMESH+ADC runs for the 120 m and 250 m cases by COB Aug 4, 2021 on Hera. @pvelissariou1 Please let the team know the location of the full PAHM+ADC runs for the 120 m and 250 m cases by COB Aug 4, 2021 on Orion.
I'm currently running the following Florence runs:
/scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210804_florence_atmesh_ww3data_250m
/scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210804_florence_besttrack_120m
/scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210804_florence_besttrack_120msubset
/scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210804_florence_besttrack_250m
When they finish, I will make an example run to serve as a model. I also sent an email to Andre asking for ATMESH forcing for the 120 m mesh; I will wait for his reply and make a new run.
Thank you Zach.
On Orion I have the 120m/250m/120m_subsetted runs for Florence and Sandy ready. We need to compare the results from Orion and Hera. For run_20210804_florence_atmesh_ww3data_250m, what atmospheric forcing are you using? At the moment the Orion jobs are pending in the queue.
Hi Takis,
Would you please share the location of the runs on Orion?
Thanks, -Saeed
Hi Saeed,
All the data, configuration files and eventually the model simulation files are in /work/noaa/nosofs/pvelissa/hsofs_tests. I have given read permissions to /work/noaa/nosofs/pvelissa.
Takis
Andre sent me the location of a 120m ATMESH forcing file, so I also added that run:
/scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210804_florence_atmesh_120msubset
/scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210804_florence_atmesh_ww3data_250m
/scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210804_florence_besttrack_120m
/scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210804_florence_besttrack_120msubset
/scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210804_florence_besttrack_250m
It looks like all of these runs ran past the 6-hour time limit overnight and did not complete.
I'm rerunning these runs here:
/scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210805_florence_atmesh_120msubset
/scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210805_florence_atmesh_250m
/scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210805_florence_besttrack_120msubset
/scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210805_florence_besttrack_250m
@pvelissariou1 suggested removing `--login` from the shebang; after doing that, the runs are all running smoothly.
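For reference, the change was presumably a one-line edit to the job scripts' shebang (a sketch, not the actual diff; script names vary per run directory):

```diff
-#!/bin/bash --login
+#!/bin/bash
```

A login shell re-sources the profile and module environment on the compute nodes, which can clobber the environment the batch script sets up.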
It looks like `run_20210805_florence_besttrack_120msubset` finished successfully, with the following output file sizes:
(coupledmodeldriver) Zachary.Burnett@hfe11 /scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210805_florence_besttrack_120msubset
> du runs/unperturbed/fort.*
31772 runs/unperturbed/fort.13
0 runs/unperturbed/fort.14
152 runs/unperturbed/fort.15
4 runs/unperturbed/fort.16
44 runs/unperturbed/fort.22
24 runs/unperturbed/fort.22.original
1135020 runs/unperturbed/fort.63.nc
2340736 runs/unperturbed/fort.64.nc
0 runs/unperturbed/fort.67.nc
0 runs/unperturbed/fort.68.nc
43900 runs/unperturbed/fort.80
and `run_20210805_florence_besttrack_250m` is currently 73% complete:
Zachary.Burnett@hfe11 /scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210805_florence_besttrack_250m
> tail runs/unperturbed/ADCIRC_HOTSTART_unperturbed.out.log -n 2
TIME STEP = 760000 73.05% COMPLETE ITERATIONS = 15 TIME = 0.15200000E+07
ELMAX = 3.5646E-001 AT NODE 1735 SPEEDMAX = 3.1848E-001 AT NODE 3250 ON MYPROC = 0
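(A quick sanity check on these numbers: TIME / TIME STEP = 1,520,000 s / 760,000 = 2.0 s, the model timestep, so 73.05% complete implies a total simulated span of roughly 1,520,000 s / 0.7305 ≈ 2.08 million s, about 24 days.)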
However, the two runs using ATMESH (`run_20210805_florence_atmesh_120msubset` and `run_20210805_florence_atmesh_250m`) appear to be stalled with no errors:
(coupledmodeldriver) Zachary.Burnett@hfe11 /scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210805_florence_atmesh_120msubset
> cat runs/unperturbed/ADCIRC_HOTSTART_unperturbed.err.log
Here is the full output of `/scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210805_florence_atmesh_120msubset/runs/unperturbed/ADCIRC_HOTSTART_unperturbed.out.log`:
(coupledmodeldriver) Zachary.Burnett@hfe11 /scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210805_florence_atmesh_120msubset
> cat runs/unperturbed/ADCIRC_HOTSTART_unperturbed.out.log
FILE_NAME >
/scratch2/COASTAL/coastal/save/com/atm/para/florence120m/Florence_HWRF_HRRR_EA120m.nc
INFO: Searching for ADCIRC subdomain directories:
INFO: Looking for './PE0000/fort.14' ...
INFO: File './PE0000/fort.14' was found!
INFO: The search for the subdomain directory was completed successfully.
INFO: The ROOTDIR is '.'.
INFO: The INPUTDIR is './PE0000'.
INFO: The GBLINPUTDIR is '.'.
INFO: The GLOBALDIR is '.'.
INFO: The LOCALDIR is './PE0000'.
_______________________________________________________________________________
PROGRAM ADCIRC VERSION v55.00-45-g49debf8
AN ADVANCED CIRCULATION MODEL FOR SHELVES, COASTAL SEAS AND ESTUARIES
- DEVELOPED BY
R.A. LUETTICH, JR
UNIVERSITY OF NORTH CAROLINA AT CHAPEL HILL
INSTITUTE OF MARINE SCIENCES
J.J. WESTERINK
DEPARTMENT OF CIVIL ENGINEERING AND GEOLOGICAL SCIENCES
UNIVERSITY OF NOTRE DAME
- THE ADCIRC SOURCE CODE IS COPYRIGHTED BY
R.A. LUETTICH, JR. AND J.J. WESTERINK, 1994-2006
NO PART OF THIS CODE MAY BE REPRODUCED OR REDISTRIBUTED
WITHOUT THE WRITTEN PERMISSION OF THE AUTHORS
_______________________________________________________________________________
INFO: initializeMesh: THE NEIGHBOR TABLE IS BEING COMPUTED.
INFO: initializeMesh: THE NEIGHBOR TABLE IS COMPLETED. THE MINIMUM NUMBER OF NEIGHBORS FOR ANY NODE = 3. 1+THE MAXIMUM NUMBER OF NEIGHBORS FOR ANY NODE = 8. THE PARAMETER MNEI CAN BE SET AS SMALL AS 8.
ADCIRC Version is v55.00-45-g49debf8
INFO: openFileForRead: The file './maxele.63' was not found.
INFO: readAndMapToSubdomainMaxMin: Values from ./maxele.63 will not reflect the solution prior to this hotstart.
INFO: readAndMapToSubdomainMaxMinNetCDF: The file ./maxele.63.nc contains no data, so the min/max record will be started anew.
INFO: openFileForRead: The file './maxvel.63' was not found.
INFO: readAndMapToSubdomainMaxMin: Values from ./maxvel.63 will not reflect the solution prior to this hotstart.
INFO: readAndMapToSubdomainMaxMinNetCDF: The file ./maxvel.63.nc contains no data, so the min/max record will be started anew.
INFO: openFileForRead: The file './maxwvel.63' was not found.
INFO: readAndMapToSubdomainMaxMin: Values from ./maxwvel.63 will not reflect the solution prior to this hotstart.
INFO: openFileForRead: The file './minpr.63' was not found.
INFO: readAndMapToSubdomainMaxMin: Values from ./minpr.63 will not reflect the solution prior to this hotstart.
_______________________________________________________________________________
LIMITED RUNTIME INFORMATION SECTION
Here is the `nems.configure` for the 120m ATMESH run (`/scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210805_florence_atmesh_120msubset/runs/unperturbed/nems.configure`):
# `nems.configure` generated with NEMSpy 1.0.3
# EARTH #
EARTH_component_list: ATM OCN
EARTH_attributes::
Verbosity = off
::
# ATM #
ATM_model: atmesh
ATM_petlist_bounds: 0 0
ATM_attributes::
Verbosity = off
::
# OCN #
OCN_model: adcirc
OCN_petlist_bounds: 1 600
OCN_attributes::
Verbosity = off
::
# Run Sequence #
runSeq::
@3600
ATM -> OCN :remapMethod=nearest_stod
ATM
OCN
@
::
Update: the run logged at `/scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210805_florence_atmesh_250m/runs/unperturbed/ADCIRC_HOTSTART_unperturbed.out.log` completed successfully, so it looks like the runs WERE running, just without any in-run output or status updates until the very end.
The goal is to reproduce this folder: /scratch2/COASTAL/coastal/save/NAMED_STORMS/Florence_ADCIRC/hsofs120m/florence.atm2ocn.20210621.atmea120m/run
Use `redist` when the ocean mesh and the forcing mesh are the same.
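For illustration, that corresponds to the connection line in the `nems.configure` run sequence; the file shown above uses `nearest_stod`, and the `redist` variant would look like this (a sketch):

```
# Run Sequence #
runSeq::
  @3600
    ATM -> OCN :remapMethod=redist
    ATM
    OCN
  @
::
```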
Zach, thanks for the update.
Please do an `ncdump -v time -t FILENAME` to see if you have data for all times. On Orion, runs completed with no errors but the resulting NetCDF files had no data in them.
Takis
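For example (a sketch; substitute the actual output file, e.g. one of the fort.63.nc files above):

```bash
# dump only the 'time' variable; -t formats times as human-readable date strings
ncdump -v time -t runs/unperturbed/fort.63.nc
```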
@zacharyburnettNOAA @pvelissariou1 All, Any update on the run folders? -Saeed
On Orion, the job for sandy/hsofs250m started just a few hours ago. There was a typo in the adcirc.job slurm script (NEMS-pahm-adcirc.x instead of NEMS-pahm_adcirc.x) and that part failed. The test runs have been re-submitted (at 7:50am CT). The run failed due to: error while loading shared libraries: libesmf.so: cannot map zero-fill pages: Cannot allocate memory
Possibly due to ESMF updates they did on the system?
I am investigating.
Zach is out today. Takis
Panagiotis Velissariou, Ph.D., P.E. UCAR Scientist National Ocean and Atmospheric Administration National Ocean Service Office of Coast Survey CSDL/CMMB Project Lead - Coastal Coupling USM - Stennis Space Center cell: (205) 227-9141 email: @.***
On Fri, Aug 6, 2021 at 7:31 AM Saeed Moghimi @.***> wrote:
@zacharyburnettNOAA https://github.com/zacharyburnettNOAA @pvelissariou1 https://github.com/pvelissariou1 All, Any update on the run folders? -Saeed
— You are receiving this because you were mentioned. Reply to this email directly, view it on GitHub https://github.com/noaa-ocs-modeling/CoastalApp/issues/78#issuecomment-894227241, or unsubscribe https://github.com/notifications/unsubscribe-auth/APC7TP2WHVXY5C2NR3DTBTLT3PIYBANCNFSM5BGTF6UA . Triage notifications on the go with GitHub Mobile for iOS https://apps.apple.com/app/apple-store/id1477376905?ct=notification-email&mt=8&pt=524675 or Android https://play.google.com/store/apps/details?id=com.github.android&utm_campaign=notification-email .
esmf/8.0.0 is not working on Orion (due to libesmf.so not being a shared library). esmf/8.1.1 has been compiled for intel/openmpi at this point. I will recompile the CoastalApp with working versions of ESMF and re-submit the jobs.
Takis
Latest update:
CoastalApp/NEMS is not compatible with esmf/7.1.0r
Using esmf/7.0.0r on Orion produces the following:
Abort(1615759) on node 19 (rank 19 in comm 0): Fatal error in PMPI_Init_thread: Other MPI error, error stack:
MPIR_Init_thread(703)........:
MPID_Init(923)...............:
MPIDI_OFI_mpi_init_hook(1287):
MPIDU_bc_table_create(309)...: unable to create a business card
In: PMI_Abort(1615759, Fatal error in PMPI_Init_thread: Other MPI error, error stack:
MPIR_Init_thread(703)........:
MPID_Init(923)...............:
MPIDI_OFI_mpi_init_hook(1287):
MPIDU_bc_table_create(309)...: unable to create a business card)
srun: Job step aborted: Waiting up to 32 seconds for job step to finish.
slurmstepd: error: STEP 2743193.0 ON Orion-07-26 CANCELLED AT 2021-08-06T10:48:16
srun: error: Orion-07-26: tasks 0-18,20-36: Killed
srun: Terminating job step 2743193.0
srun: error: Orion-07-26: task 19: Exited with exit code 143
srun: error: Orion-07-27: tasks 37-73: Killed
srun: error: Orion-22-57: tasks 111-147: Killed
srun: error: Orion-22-60: tasks 221-256: Killed
srun: error: Orion-22-59: tasks 185-220: Killed
srun: error: Orion-22-58: tasks 148-184: Killed
srun: error: Orion-22-56: tasks 74-110: Killed
Version esmf/8.0.0 is also not working (I'll submit a ticket).
I am switching to the combination intel/openmpi esmf/8.1.1 to see what happens.
Takis
At this time, it seems there is no working combination of intel/impi/hdf5/netcdf/esmf on Orion with which to compile and run the CoastalApp. Please see my previous email. I will try openmpi instead of impi just to test the 120m/250m configurations.
I have submitted a detailed ticket to orion IT.
Takis
Current run, with the GWCE solution scheme set to `explicit`:
(coupledmodeldriver) Zachary.Burnett@hfe09 /scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210809_florence_multivariate_besttrack_120m_5members
> check_completion
{
"spinup": "running",
"runs": {
"vortex_4_variable_perturbation_4": "not_started",
"vortex_4_variable_perturbation_5": "not_started",
"vortex_4_variable_perturbation_1": "not_started",
"vortex_4_variable_perturbation_3": "not_started",
"original": "not_started",
"vortex_4_variable_perturbation_2": "not_started"
}
}
@wpringle @zacharyburnettNOAA
Here is what I remember
@zacharyburnettNOAA Please document these cases (inputs/JSONs) for bit-by-bit reproducibility.
@saeed-moghimi-noaa @pvelissariou1 just FYI, I've added a script to `coupledmodeldriver` that checks the status of an ADCIRC run directory and returns one of `completed`, `running`, `failed`, `error`, or `not_started`.
I built it for my own use but figured that it might be helpful to you as well; all you need to do is activate the anaconda environment and then you have the `check_completion` command in your path:
conda activate coupledmodeldriver
cd /scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210810_florence_atmesh_120m
check_completion
check_completion --verbose
Status update from yesterday:
This run didn't like the `redist` method (even with the 120m ATMESH forcing), so I used `bilinear` instead. This run did not have any errors, but overstepped the 6-hour time limit (and so DID NOT COMPLETE). I will try again with two runs: `nearest_stod` and `nearest_dtos`.
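For reference, those correspond to one-line changes to the connection in `nems.configure` (a sketch; `nearest_stod` maps nearest source-to-destination, `nearest_dtos` nearest destination-to-source):

```
# in the runSeq connection line, one of:
ATM -> OCN :remapMethod=nearest_stod
ATM -> OCN :remapMethod=nearest_dtos
```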
This run completed 100%:
(coupledmodeldriver) Zachary.Burnett@hfe08 /scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210810_florence_atmesh_250m
> du runs/unperturbed/fort.* -csh
154M runs/unperturbed/fort.13
0 runs/unperturbed/fort.14
132K runs/unperturbed/fort.15
4.0K runs/unperturbed/fort.16
9.9M runs/unperturbed/fort.61.nc
2.1G runs/unperturbed/fort.63.nc
4.5G runs/unperturbed/fort.64.nc
0 runs/unperturbed/fort.67.nc
0 runs/unperturbed/fort.68.nc
3.0G runs/unperturbed/fort.73.nc
7.0G runs/unperturbed/fort.74.nc
125M runs/unperturbed/fort.80
17G total
but had the following errors when stopping:
application called MPI_Abort(comm=0xC4000039, 0) - process 0
slurmstepd: error: *** STEP 21516200.0 ON h5c06 CANCELLED AT 2021-08-11T03:17:32 ***
srun: Job step aborted: Waiting up to 32 seconds for job step to finish.
srun: error: h15c15: task 279: Killed
srun: launch/slurm: _step_signal: Terminating StepId=21516200.0
srun: error: h15c14: task 239: Killed
srun: error: h18c27: task 834: Killed
srun: error: h18c41: task 914: Killed
srun: error: h22c40: task 1456: Killed
srun: error: h5c06: tasks 1-39: Killed
...
srun: error: h22c40: tasks 1455,1457-1493: Killed
srun: error: h15c15: tasks 240-278: Killed
srun: error: h15c14: tasks 200-238: Killed
srun: error: h23c50: tasks 1923-1961: Killed
srun: error: h15c17: tasks 280-319: Killed
srun: error: h22c38: tasks 1416-1454: Killed
srun: error: h23c02: tasks 1533-1571: Killed
srun: error: h23c14: tasks 1611-1649: Killed
srun: error: h18c27: tasks 831-833,835-869: Killed
srun: error: h5c06: task 0: Killed
@zacharyburnettNOAA Please share the location on Hera.
Sure thing, here:
/scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210810_florence_atmesh_250m
/scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210810_florence_atmesh_120m
The 120 m run seems not to have completed. Did you compare your inputs with Yuji's? The 250 m run looks more promising.
I am currently running at the following location:
/scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210812_florence_atmesh_120m_yujicopy_newnems
{
"spinup": "completed - 100.0%",
"runs": {
"unperturbed": "running - 0%"
}
}
I've run a copy of Yuji's run (using as many identical parameters as possible, but using the newly-built NEMS), and it seems to have completed:
(coupledmodeldriver) Zachary.Burnett@hfe07 /scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210812_florence_atmesh_120m_yujicopy_newnems
> check_completion
{
"spinup": "completed - 100.0%",
"runs": {
"unperturbed": "completed - 100.0%"
}
}
(coupledmodeldriver) Zachary.Burnett@hfe07 /scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210812_florence_atmesh_120m_yujicopy_newnems
> du runs/unperturbed/fort.* -csh
52M runs/unperturbed/fort.13
0 runs/unperturbed/fort.14
76K runs/unperturbed/fort.15
4.0K runs/unperturbed/fort.16
9.8G runs/unperturbed/fort.63.nc
21G runs/unperturbed/fort.64.nc
0 runs/unperturbed/fort.67.nc
0 runs/unperturbed/fort.68.nc
14G runs/unperturbed/fort.73.nc
22G runs/unperturbed/fort.74.nc
386M runs/unperturbed/fort.80
66G total
I am currently copying it to the following directory:
/scratch2/COASTAL/coastal/save/NAMED_STORMS/Florence_ADCIRC/run_20210812_florence_atmesh_120m_yujicopy_newnems
Here is the configuration for this 120m run:
configure_adcirc.json
{
"adcirc_executable_path": "../../../../repositories/CoastalApp_PaHM/ALLBIN_INSTALL/NEMS.x",
"adcprep_executable_path": "../../../../repositories/CoastalApp_PaHM/ALLBIN_INSTALL/adcprep",
"aswip_executable_path": null,
"modeled_start_time": "2018-09-13 06:00:00",
"modeled_end_time": "2018-10-03 18:00:00",
"modeled_timestep": 2.0,
"fort_13_path": "/scratch2/COASTAL/coastal/save/NAMED_STORMS/Florence_ADCIRC/hsofs120m/florence.atm2ocn.20210621.atmea120m/run/fort.13",
"fort_14_path": "/scratch2/COASTAL/coastal/save/NAMED_STORMS/Florence_ADCIRC/hsofs120m/florence.atm2ocn.20210621.atmea120m/run/fort.14",
"tidal_spinup_duration": "05:00:00:00",
"tidal_spinup_timestep": 2.0,
"source_filename": "../../../../repositories/CoastalApp_PaHM/modulefiles/envmodules_intel.hera",
"use_original_mesh": false,
"output_surface": true,
"surface_output_interval": "01:00:00",
"output_stations": false,
"stations_file_path": "../../../../models/meshes/hsofs/120m/v3.0_20210401/stations.txt",
"stations_output_interval": "00:10:00",
"output_spinup": true,
"output_elevations": true,
"output_velocities": true,
"output_concentrations": false,
"output_meteorological_factors": true,
"processors": 600,
"nems_parameters": {},
"attributes": {
...
"gwce_solution_scheme": "explicit",
...
"smagorinsky": -0.05,
...
"ICS": 20
}
}
configure_atmesh.json
{
"nws": 17,
"resource": "/scratch2/COASTAL/coastal/noscrub/Yuji.Funakoshi/nsem-workflow/data/com/atm/para/florence/Florence_HWRF_HRRR_EA120m.nc",
"processors": 1,
"nems_parameters": {},
"interval": 3600
}
configure_tidal.json
{
"tidal_source": "TPXO",
"constituents": [
"Q1",
"O1",
"P1",
"K1",
"N2",
"M2",
"S2",
"K2"
],
"resource": "../../../../models/forcings/tides/h_tpxo9.v1.nc"
}
configure_nems.json
{
"executable_path": "../../../../repositories/CoastalApp_PaHM/ALLBIN_INSTALL/NEMS.x",
"modeled_start_time": "2018-09-13 06:00:00",
"modeled_end_time": "2018-10-03 18:00:00",
"interval": "01:00:00",
"connections": [
"ATM -> OCN :remapMethod=redist"
],
"mediations": [],
"sequence": [
"ATM -> OCN",
"ATM",
"OCN"
]
}
I'll work toward making these absolute paths in the example run.
@zacharyburnettNOAA
Please make a diff between the run folder that you had and the one Takis made, and see where the differences are. We want to see which of these settings is more computationally efficient, and why your previous runs did not complete.
Thanks, Saeed. Zach, besides the config files, please check the number of cores you are using as well. Is your spinup duration actually 5 hrs?
On Orion for Sandy I got ~1 hr (7-day spinup) and ~6 hrs for the simulation (22 days).
Takis
TODO:
Hi all,
Two issues have come up while testing the latest version of the CoastalApp repo on Hera, using the Florence 250 m HSOFS case:
Error #1: When cloning the code with
$ git clone https://github.com/noaa-ocs-modeling/CoastalApp -b feature/pahm --recursive
I get the error that "This repository is over its data quota. Account responsible for LFS bandwidth should purchase more data packs to restore access." (see below). Do others get the same message?
...
Submodule path 'ADCIRC': checked out '49debf8b84356256c72bb9b4e5501bfe4ac5234a'
Submodule path 'ATMESH': checked out '7aab919a12b165cf9a65498d566575cd86659cbc'
Submodule path 'NEMS': checked out 'dc8b2de497031eb894d21712af9c139ef9c67c3f'
Submodule 'tests/produtil/NCEPLIBS-pyprodutil' (https://github.com/NOAA-EMC/NCEPLIBS-pyprodutil) registered for path 'NEMS/tests/produtil/NCEPLIBS-pyprodutil'
Cloning into '/scratch2/COASTAL/coastal/save/Andre.VanderWesthuysen/OCS-NEMS3/CoastalApp/NEMS/tests/produtil/NCEPLIBS-pyprodutil'...
remote: Enumerating objects: 345, done.
remote: Counting objects: 100% (9/9), done.
remote: Compressing objects: 100% (7/7), done.
remote: Total 345 (delta 6), reused 2 (delta 2), pack-reused 336
Receiving objects: 100% (345/345), 300.61 KiB | 6.83 MiB/s, done.
Resolving deltas: 100% (220/220), done.
Submodule path 'NEMS/tests/produtil/NCEPLIBS-pyprodutil': checked out 'ca171b95095db4fcd0fc7b01c23d073d90becd99'
Submodule path 'NWM': checked out '3bc401d298070515cb6171a585d2d19646afd650'
Downloading doc/formats/hurdat2-format.pdf (421 KB)
Error downloading object: doc/formats/hurdat2-format.pdf (34c48b5): Smudge error: Error downloading doc/formats/hurdat2-format.pdf (34c48b5a916768fdc8d3b0b509d046013c3c7d50eb66694baa4a0f0eb8da8eb9): batch response: This repository is over its data quota. Account responsible for LFS bandwidth should purchase more data packs to restore access.
Errors logged to /scratch2/COASTAL/coastal/save/Andre.VanderWesthuysen/OCS-NEMS3/CoastalApp/.git/modules/PAHM/lfs/logs/20210823T114418.181796877.log
Use `git lfs logs last` to view the log.
error: external filter 'git-lfs filter-process' failed
fatal: doc/formats/hurdat2-format.pdf: smudge filter lfs failed
Submodule path 'WW3': checked out '9726c8b6757b3578493b4ed1653abc90a76fcf0d'
Submodule path 'WW3DATA': checked out 'beda5f2eab6bf670991ad970fc064e0a80584f0a'
Unable to checkout '6212f2d1905ff3c080f1358b82c43b83fface148' in submodule path 'PAHM'
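A possible workaround (untested here) is to skip the LFS smudge step during the clone and fetch the large files later, once quota is available:

```bash
# clone without downloading LFS-tracked files (e.g. the PDF that hit the quota)
GIT_LFS_SKIP_SMUDGE=1 git clone https://github.com/noaa-ocs-modeling/CoastalApp -b feature/pahm --recursive

# later, fetch the LFS objects from inside the affected submodule (here PAHM)
cd CoastalApp/PAHM && git lfs pull
```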
Error #2: After compiling the NEMS.x with the options:
$ ./build.sh --component "ADCIRC ATMESH WW3" --plat hera --compiler intel --clean -2
and running the NEMS.x for Florence 250 HSOFS, I get the following error in PE*/fort.16 that the rads.64.nc file was not created by adcprep (see details below). The fort.15 has the input:
517 ! NWS - WIND STRESS AND BAROMETRIC PRESSURE OPTION PARAMETER
This file has always been generated in previous ADCIRC versions. Again, did anyone else run into this?
... A NEW GLOBAL ELEVATION FILE WILL BE STARTED
0 LINES OR RECORDS WRITTEN IN THE GLOBAL VELOCITY FILE BY THE TIME OF THE HOT START
SPOOL COUNTER = 1799
A NEW GLOBAL VELOCITY FILE WILL BE STARTED
0 LINES OR RECORDS WRITTEN IN THE GLOBAL CONCENTRATION FILE BY THE TIME OF THE HOT START
SPOOL COUNTER = 0
0 LINES OR RECORDS WRITTEN IN THE GLOBAL WIND FILE BY THE TIME OF THE HOT START
0 LINES OR RECORDS WRITTEN IN THE GLOBAL PRESSURE FILE BY THE TIME OF THE HOT START
SPOOL COUNTER = 0
NEW GLOBAL WIND & pressure FILEs WILL BE STARTED
NEW GLOBAL WIND & pressure FILEs WILL BE STARTED
ERROR: createNetCDFOutputFile: The NetCDF output file './rads.64.nc' was not found. It should have been created by adcprep.
ERROR: initOutput2D: There was an issue with NetCDF file initialization.
INFO: terminate: ADCIRC Terminating.
All,
I have found and resolved the issues that prevented the ATMESH-WW3-ADCIRC (atm2wav2ocn) runs from succeeding. The first, as discussed last week, is a condition for NWS=517 that got dropped along the way, causing the file rads.64.nc not to be created. On Saeed's suggestion, this was resolved with:
$ git diff prep.F
diff --git a/prep/prep.F b/prep/prep.F
index 5414a37..134f8f0 100644
--- a/prep/prep.F
+++ b/prep/prep.F
@@ -7658,7 +7658,7 @@ C unfortunate cut-and-paste duplication unnecessary.
type(OutputDataDescript_t), SAVE :: dynamicWaterlevelCorrectionStaDescript ! dynamicWaterlevelCorrection.61
! tcm v50.75 moved RSDescript outside of the ifdef adcswan for use with
-! nrs = 3 or nrs= 4
+! nrs = 3 or nrs= 4 or nrs = 5
type(OutputDataDescript_t), SAVE :: RSDescript
#ifdef ADCSWAN
@@ -8076,7 +8076,7 @@ C maxwvel.63
RSMaxDescript % readMaxMin = .true.
call makeFileName(RSMaxDescript)
-! tcm v50.75 removed ifdef adcswan to allow for use whenever nrs=3 or nrs=4
+! tcm v50.75 removed ifdef adcswan to allow for use whenever nrs=3 or nrs=4 or nrs=5
!#ifdef ADCSWAN
Cobell 20120510: SWAN Output Data
C........Radiation Stress
@@ -8843,9 +8843,9 @@ C was specified.
endif
!
! tcm v50.75 moved ifdef adcswan below RSDescript only to allow
- ! for use whenever nrs=3 or nrs=4
+ ! for use whenever nrs=3 or nrs=4 or nrs=5
! Cobell 20120510: Added for SWAN NetCDF
- IF ((NRS.EQ.3).OR.(NRS.EQ.4)) THEN
+ IF ((NRS.EQ.3).OR.(NRS.EQ.4).OR.(NRS.EQ.5)) THEN
CALL initNetCDFOutputFile(RSDescript,reterr)
ENDIF
! tcm v50.75 moved ifdef adcswan to here
Once this was resolved, WW3 and ADCIRC were still not connecting the radiation stress exchange. It turns out that this was due to an old inconsistency in the naming of the radiation stress fields which slipped back into the ADCIRC cap code. These standard field names already exist in the NUOPC Field Dictionary as "eastward_wave_radiation_stress", "northward_wave_radiation_stress", and "eastward_northward_wave_radiation_stress", and do not need to be created as new dictionary entries. The fix is as follows, and should be reintroduced into the ADCIRC cap:
$ git diff adc_cap.F90
diff --git a/thirdparty/nuopc/adc_cap.F90 b/thirdparty/nuopc/adc_cap.F90
index 399fef1..876b9b3 100644
--- a/thirdparty/nuopc/adc_cap.F90
+++ b/thirdparty/nuopc/adc_cap.F90
@@ -123,14 +123,14 @@
!!
!! Standard Name |Short Name | Units | Model Variable | File | Description | Notes
!! ----------------------------|-----------|----------------------|-----------------|--------------|----------------------------|-----------------
-!! eastward_radiation_stress |sxx | N.m^-2/rho -> m2s-2 | ADCIRC_SXX | global.F | |
-!! northward_radiation_stress |syy | N.m^-2/rho -> m2s-2 | ADCIRC_SXY | global.F | |
-!! cross_radiation_stress |sxy | N.m^-2/rho -> m2s-2 | ADCIRC_SXY | global.F | |
+!! eastward_wave_radiation_stress |sxx | N.m^-2/rho -> m2s-2 | ADCIRC_SXX | global.F | |
+!! northward_wave_radiation_stress |syy | N.m^-2/rho -> m2s-2 | ADCIRC_SXY | global.F | |
+!! eastward_northward_wave_radiation_stress |sxy | N.m^-2/rho -> m2s-2 | ADCIRC_SXY | global.F | |
!!
!!expFieldName expFieldStdName
-!!sxx eastward_radiation_stress
-!!syy northward_radiation_stress
-!!sxy cross_radiation_stress
+!!sxx eastward_wave_radiation_stress
+!!syy northward_wave_radiation_stress
+!!sxy eastward_northward_wave_radiation_stress
!!ADCIRC accepts wave-driven stresses "in units of velocity squared
!! (consistent with the units of gravity). Stress in these units is obtained
!! by dividing stress in units of force/area by the reference density of water."
@@ -465,27 +465,27 @@ module adc_cap
!--------- import fields to Sea Adc -------------
!TODO: Consider moving these lines to driver to avoid doing it in both CAPS
- call NUOPC_FieldDictionaryAddEntry("eastward_radiation_stress", "mx", rc=rc)
- if (ESMF_LogFoundError(rcToCheck=rc, msg=ESMF_LOGERR_PASSTHRU, &
- line=__LINE__, &
- file=__FILE__)) &
- return ! bail out
-
- call NUOPC_FieldDictionaryAddEntry("northward_radiation_stress", "mx", rc=rc)
- if (ESMF_LogFoundError(rcToCheck=rc, msg=ESMF_LOGERR_PASSTHRU, &
- line=__LINE__, &
- file=__FILE__)) &
- return ! bail out
-
- call NUOPC_FieldDictionaryAddEntry("cross_radiation_stress", "mx", rc=rc)
- if (ESMF_LogFoundError(rcToCheck=rc, msg=ESMF_LOGERR_PASSTHRU, &
- line=__LINE__, &
- file=__FILE__)) &
- return ! bail out
+! call NUOPC_FieldDictionaryAddEntry("eastward_radiation_stress", "mx", rc=rc)
+! if (ESMF_LogFoundError(rcToCheck=rc, msg=ESMF_LOGERR_PASSTHRU, &
+! line=__LINE__, &
+! file=__FILE__)) &
+! return ! bail out
+!
+! call NUOPC_FieldDictionaryAddEntry("northward_radiation_stress", "mx", rc=rc)
+! if (ESMF_LogFoundError(rcToCheck=rc, msg=ESMF_LOGERR_PASSTHRU, &
+! line=__LINE__, &
+! file=__FILE__)) &
+! return ! bail out
+!
+! call NUOPC_FieldDictionaryAddEntry("cross_radiation_stress", "mx", rc=rc)
+! if (ESMF_LogFoundError(rcToCheck=rc, msg=ESMF_LOGERR_PASSTHRU, &
+! line=__LINE__, &
+! file=__FILE__)) &
+! return ! bail out
- call fld_list_add(num=fldsToAdc_num, fldlist=fldsToAdc, stdname="eastward_radiation_stress", shortname= "sxx")
- call fld_list_add(num=fldsToAdc_num, fldlist=fldsToAdc, stdname="northward_radiation_stress",shortname= "syy")
- call fld_list_add(num=fldsToAdc_num, fldlist=fldsToAdc, stdname="cross_radiation_stress", shortname= "sxy")
+ call fld_list_add(num=fldsToAdc_num, fldlist=fldsToAdc, stdname="eastward_wave_radiation_stress", shortname= "sxx")
+ call fld_list_add(num=fldsToAdc_num, fldlist=fldsToAdc, stdname="northward_wave_radiation_stress", shortname= "syy")
+ call fld_list_add(num=fldsToAdc_num, fldlist=fldsToAdc, stdname="eastward_northward_wave_radiation_stress", shortname= "sxy")
!--------- import fields from atm to Adc -------------
call fld_list_add(num=fldsToAdc_num, fldlist=fldsToAdc, stdname= "air_pressure_at_sea_level", shortname= "pmsl" )
call fld_list_add(num=fldsToAdc_num, fldlist=fldsToAdc, stdname= "inst_merid_wind_height10m", shortname= "imwh10m" )
@awest-noaa Excellent! I hope you don't mind, I edited your comment to add `diff` to make the differences colored and easier to read. Let me know if you want to change it back.