Closed saeed-moghimi-noaa closed 1 month ago
Hi Saeideh and Ali (@sbanihash @AliS-Noaa),
I have already been granted access to the coastal project on HPSS. Could you please let me know how to get the GFS data and the script to process it?
Thank you very much!
Hello Yunfang,
Here is the path on HERA where you can find the bash script: /scratch2/NCEPDEV/marine/alisalimi/SWELL-WESTCOAST/GFS-WAVE-2022/dl-hpss.sh
The path where you can find the GFS data is also in the bash script. Cheers,
@AliS-Noaa Hi Ali,
Thank you very much! I have the script now; it seems that I also need to apply for access to the folder /NCEPPROD/hpssprod/, as I can only access /STI/coastal at this moment.
Hi Ali @AliS-Noaa ,
I have retrieved the GFS spectral data from HPSS. Those files are ASCII; do you have any scripts to process them into a NetCDF boundary-condition (B.C.) file?
Thank you very much!
Hi Yunfang,
There are grib2 files which contain the gridded data. You can export NetCDF files using wgrib2 for your usage. Please let me know if you need me to show you how to do it. Also, WW3 can process ASCII boundary files as well: ww3_bound processes ASCII boundary data, while ww3_bounc processes NetCDF files.
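As a sketch of the wgrib2 route described above (the -netcdf and -match options are standard wgrib2; the file names and the record selector are illustrative examples, not taken from this thread):

```
# Convert a GFS wave grib2 file to NetCDF (example file name):
wgrib2 gfswave.t18z.global.0p25.f000.grib2 -netcdf gfswave.f000.nc

# Or keep only selected records first, e.g. significant wave height:
wgrib2 gfswave.t18z.global.0p25.f000.grib2 -match ":HTSGW:" -netcdf htsgw.f000.nc
```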
Cheers, Ali
@AliS-Noaa Hi Ali. Do you have any scripts at your disposal that read the ASCII spec files and generate a single spec file (ASCII or NetCDF) that contains the hourly data per day? Maybe Python/MATLAB scripts? This would be very helpful to us.
Yunfang,
You should be able to use some of these scripts that @aliabdolali created to handle spectral data. https://github.com/NOAA-EMC/WW3-tools/tree/develop/matlab_tools
There are scripts to read and write spectral ww3 data. Ali
Hi Ali,
The files I downloaded are located at /scratch2/STI/coastal/Yunfang.Sun/HPSS on Hera, there are no grib2 files, only ASCII files. Could you please help me to check if I am using the correct files in that folder? I was using /scratch2/STI/coastal/Yunfang.Sun/HPSS/dl-hpss.sh to retrieve the files.
Thank you very much!
Best,
Yunfang
Hello Yunfang,
For example, if you do htar -tvf /NCEPPROD/hpssprod/runhistory/rh2022/202212/20221212/com_gfs_v16.3_gfs.20221212_18.gfs.tar you can see all the grib2 files.
We can have a meeting if you want and I will show you how to get them and generate the netcdf files from grib2 files.
Cheers, -------------------------------------------------------- Ali Salimi-Tarazouj, Ph.D. Physical Scientist, Coastal Engineer Lynker at NOAA/NWS/NCEP/EMC 5830 University Research Court College Park, Maryland, 20740 Office: (202) 964-0965 Mobile: (302) 588-5505
Hi Ali,
Which module should I load to use "htar -tvf /NCEPPROD/hpssprod/runhistory/rh2022/202212/20221212/com_gfs_v16.3_gfs.20221212_18.gfs.tar"?
As I got the following error:
[hpsscore1]/NCEPPROD->htar -tvf /NCEPPROD/hpssprod/runhistory/rh2022/202212/20221212/com_gfs_v16.3_gfs.20221212_18.gfs.tar
*** unrecognized command: 'htar -tvf /NCEPPROD/hpssprod/runhistory/rh2022/202212/20221212/com_gfs_v16.3_gfs.20221212_18.gfs.tar'
[hpsscore1]/NCEPPROD->
Thank you!
Yunfang
Yunfang,
Please use: module load hpss
Cheers, Ali
Hi Ali,
I have loaded the module with "module load hpss", then used "hsi" to enter the HPSS system, and it gives me the following error:
[hpsscore1]/NCEPPROD->htar -tvf /NCEPPROD/hpssprod/runhistory/rh2022/202212/20221212/com_gfs_v16.3_gfs.20221212_18.gfs.tar
*** unrecognized command: 'htar -tvf /NCEPPROD/hpssprod/runhistory/rh2022/202212/20221212/com_gfs_v16.3_gfs.20221212_18.gfs.tar'
[hpsscore1]/NCEPPROD->
Do you have a clue, or do I need to contact the HPSS help?
Thank you!
Yunfang
@AliS-Noaa Thanks for your help.
From what I recall, you need to request HPSS access through your FOM; otherwise you cannot access HPSS.
I believe Yunfang has already been granted access to HPSS.
Panagiotis Velissariou, Ph.D., P.E. Scientist III OAI at the Office of Coast Survey CSDL/CMMB National Ocean Service National Oceanic and Atmospheric Administration cell: (205) 227-9141
Hi @aliabdolali ,
Thank you for your reply. I figured out that the command should be used on Hera, not inside the HPSS (hsi) session.
Best,
Yunfang
Hi @aliabdolali @AliS-Noaa, I want to set up WW3 with only surface wind forcing and without open boundaries for the Atlantic domain. Could you tell me which case in the regression tests would be a good example to base my case on? Thank you very much! Best, Yunfang
If you do not execute ww3_bound or ww3_bounc, the nest.ww3 file is not generated, and therefore the model does not use boundary conditions.
Hi Ali @aliabdolali ,
Thank you for your reply.
When I try to use the global ERA5 wind forcing, I can't produce wind.ww3. In my ww3_prnc.inp, the setting is as follows:
$
'WND' 'LL' T T
$
$ Name of dimensions ------------------------------------------------- $
$
$ longitude latitude time
time, latitude, longitude
$
$ Variables to use --------------------------------------------------- $
$
u10 v10
$ uwnd vwnd
$zeta
$u-vel v-vel
$
$ Additional time input ---------------------------------------------- $
$ If time flag is .FALSE., give time of field in yyyymmdd hhmmss format.
$
$ 19680606 053000
$
$ Define data files -------------------------------------------------- $
$ The input line identifies the filename using for the forcing field.
download_inv.nc
$
$ -------------------------------------------------------------------- $
$ End of input file $
$ -------------------------------------------------------------------- $
And the download_inv.nc file looks like this:
ncdump -h download_inv.nc
netcdf download_inv {
dimensions:
time = UNLIMITED ; // (1464 currently)
longitude = 1440 ;
latitude = 721 ;
variables:
int time(time) ;
time:standard_name = "time" ;
time:long_name = "time" ;
time:units = "hours since 1900-01-01 00:00:00.0" ;
time:calendar = "gregorian" ;
time:axis = "T" ;
float longitude(longitude) ;
longitude:standard_name = "longitude" ;
longitude:long_name = "longitude" ;
longitude:units = "degrees_east" ;
longitude:axis = "X" ;
float latitude(latitude) ;
latitude:standard_name = "latitude" ;
latitude:long_name = "latitude" ;
latitude:units = "degrees_north" ;
latitude:axis = "Y" ;
short u10(time, latitude, longitude) ;
u10:long_name = "10 metre U wind component" ;
u10:units = "m s**-1" ;
u10:add_offset = -0.692859959977449 ;
u10:scale_factor = 0.00110688284552248 ;
u10:_FillValue = -32767s ;
u10:missing_value = -32767s ;
short v10(time, latitude, longitude) ;
v10:long_name = "10 metre V wind component" ;
v10:units = "m s**-1" ;
v10:add_offset = 3.74782109577442 ;
v10:scale_factor = 0.00106191001366916 ;
v10:_FillValue = -32767s ;
v10:missing_value = -32767s ;
short d2m(time, latitude, longitude) ;
d2m:long_name = "2 metre dewpoint temperature" ;
d2m:units = "K" ;
d2m:add_offset = 247.993123435418 ;
d2m:scale_factor = 0.00177497974951253 ;
d2m:_FillValue = -32767s ;
d2m:missing_value = -32767s ;
short msl(time, latitude, longitude) ;
msl:standard_name = "air_pressure_at_mean_sea_level" ;
msl:long_name = "Mean sea level pressure" ;
msl:units = "Pa" ;
msl:add_offset = 98586.6813332596 ;
msl:scale_factor = 0.262333480841713 ;
msl:_FillValue = -32767s ;
msl:missing_value = -32767s ;
// global attributes:
:CDI = "Climate Data Interface version 1.9.10 (https://mpimet.mpg.de/cdi)" ;
:Conventions = "CF-1.6" ;
:history = "Mon Sep 18 10:02:45 2023: cdo invertlat download.nc download_inv.nc\n",
"2023-09-18 13:34:14 GMT by grib_to_netcdf-2.25.1: /opt/ecmwf/mars-client/bin/grib_to_netcdf.bin -S param -o /cache/data6/adaptor.mars.internal-1695043598.8457577-27378-3-fe079c82-8c62-4e80-8ae1-47406b4a76ff.nc /cache/tmp/fe079c82-8c62-4e80-8ae1-47406b4a76ff-adaptor.mars.internal-1695043353.1076667-27378-3-tmp.grib" ;
:CDO = "Climate Data Operators version 1.9.10 (https://mpimet.mpg.de/cdo)" ;
}
When I run the ww3_prnc, it gives me the following error:
Abort(1090575) on node 0 (rank 0 in comm 0): Fatal error in PMPI_Init: Other MPI error, error stack:
MPIR_Init_thread(143):
MPID_Init(1221)......:
MPIR_pmi_init(168)...: PMI2_Job_GetId returned 14
Abort(1090575) on node 0 (rank 0 in comm 0): Fatal error in PMPI_Init: Other MPI error, error stack:
MPIR_Init_thread(143):
MPID_Init(1221)......:
MPIR_pmi_init(168)...: PMI2_Job_GetId returned 14
Abort(1090575) on node 0 (rank 0 in comm 0): Fatal error in PMPI_Init: Other MPI error, error stack:
MPIR_Init_thread(143):
MPID_Init(1221)......:
MPIR_pmi_init(168)...: PMI2_Job_GetId returned 14
Abort(1090575) on node 0 (rank 0 in comm 0): Fatal error in PMPI_Init: Other MPI error, error stack:
MPIR_Init_thread(143):
MPID_Init(1221)......:
MPIR_pmi_init(168)...: PMI2_Job_GetId returned 14
Abort(1090575) on node 0 (rank 0 in comm 0): Fatal error in PMPI_Init: Other MPI error, error stack:
MPIR_Init_thread(143):
MPID_Init(1221)......:
MPIR_pmi_init(168)...: PMI2_Job_GetId returned 14
The switch I am using is as follows: NCO PDLIB SCOTCH NOGRB DIST MPI PR3 UQ FLX0 SEED ST4 STAB0 NL1 BT1 DB1 MLIM FLD2 TR0 BS0 RWND WNX1 WNT1 CRX1 CRT1 O0 O1 O2 O3 O4 O5 O6 O7 O14 O15 IC0 IS0 REF0 SCRIP SCRIPNC
The files are located at /scratch2/STI/coastal/Yunfang.Sun/ww3_hera/ian on Hera.
Could you please give me any suggestions to fix it?
Thank you again!
Best,
Yunfang
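An aside on the ncdump above: u10/v10 (and the other variables) are stored as 16-bit shorts, and the physical values are recovered from each variable's scale_factor and add_offset attributes. NetCDF readers typically apply this unpacking automatically; a minimal sketch of the arithmetic (the sample packed value 10000 is hypothetical, the attribute values are copied from the u10 entry above):

```python
# Unpack a short-packed ERA5 value: value = packed * scale_factor + add_offset.
SCALE = 0.00110688284552248    # u10:scale_factor from the ncdump above
OFFSET = -0.692859959977449    # u10:add_offset
FILL = -32767                  # u10:_FillValue

def unpack(packed: int):
    """Return the physical wind component in m/s, or None for fill values."""
    if packed == FILL:
        return None
    return packed * SCALE + OFFSET

print(unpack(10000))  # ~10.376 m/s
print(unpack(FILL))   # None
```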
@yunfangsun You need to run ww3_prnc (and the other standalone WW3 programs) through the Slurm scheduler and srun. It is the same issue we discussed a few days ago when generating the ESMFMesh files. Please check any of the setup jobs in the run folder of the ww3 tests in CoastalApp-testsuite. Could you please try this?
Hi Takis, @pvelissariou1,
I was using Slurm and srun, and it seems that this error starts when reading the NetCDF file:
Description of inputs
--------------------------------------------------
Input type : winds
Format type : long.-lat. grid
Field conserves velocity.
File name : download_inv.nc
Dimension along x : time
Dimension along y : latitude
Field component 1 : u10
Field component 2 : v10
Abort(59) on node 1 (rank 1 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 59) - process 1
Abort(59) on node 3 (rank 3 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 59) - process 3
Abort(59) on node 5 (rank 5 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 59) - process 5
It mixed up the time and longitude dimensions, which caused the error.
@yunfangsun I see that the NetCDF file might have an issue. Is it on hera?
Hi Takis,
It is /scratch2/STI/coastal/Yunfang.Sun/ww3_hera/ian/download_inv.nc on Hera.
Thank you!
@yunfangsun you forgot to put time in the "Name of dimensions" list.
Hi Ali @AliS-Noaa ,
My run is from Sep 01, 2022, to Oct 15, 2022, and the forcing file includes the whole month of September and October of 2022.
And I previously set
"$ Name of dimensions ---------------------------------------------
time latitude longitude"
but it gives: File name : download_inv.nc / Dimension along x : time / Dimension along y : latitude
Therefore, I removed the time dimension in ww3_prnc.inp to see if it made any difference. It seems that including the time dimension or not made no difference.
Hi @AliS-Noaa The error message shows
*** WAVEWATCH III ERROR IN PRNC :
LINE NUMBER 869
NETCDF ERROR MESSAGE:
*** WAVEWATCH III ERROR IN PRNC :
LINE NUMBER 869
NETCDF ERROR MESSAGE:
*** WAVEWATCH III ERROR IN PRNC :
LINE NUMBER 869
NETCDF ERROR MESSAGE:
*** WAVEWATCH III ERROR IN PRNC :
LINE NUMBER 869
NETCDF ERROR MESSAGE:
*** WAVEWATCH III ERROR IN PRNC :
LINE NUMBER 869
NETCDF ERROR MESSAGE:
*** WAVEWATCH III ERROR IN PRNC :
LINE NUMBER 869
NETCDF ERROR MESSAGE:
NetCDF: Index exceeds dimension bound
NetCDF: Index exceeds dimension bound
In ww3_prnc.F90, around line 869, the code is:
! Get longitude and latitude
IF (ITYPE.NE.1.AND.ITYPE.NE.6) THEN
ALLOCATE (ALA(NXI,NYI))
ALLOCATE (ALO(NXI,NYI))
! get longitude
IRET=NF90_INQ_VARID(NCID,"longitude",VARIDTMP)
IF ( IRET/=NF90_NOERR ) IRET=NF90_INQ_VARID(NCID,"lon",VARIDTMP)
IF ( IRET/=NF90_NOERR ) IRET=NF90_INQ_VARID(NCID,"Longitude",VARIDTMP)
IF ( IRET/=NF90_NOERR ) IRET=NF90_INQ_VARID(NCID,"x",VARIDTMP)
IF ( IRET/=NF90_NOERR ) IRET=NF90_INQ_VARID(NCID,"X",VARIDTMP)
IRET = NF90_INQUIRE_VARIABLE(NCID, VARIDTMP, ndims = NUMDIMS)
call CHECK_ERR(IRET)
IF (NUMDIMS.EQ.1) THEN
IRET=NF90_GET_VAR(NCID,VARIDTMP,X0I,start=(/1/))
call CHECK_ERR(IRET)
IRET=NF90_GET_VAR(NCID,VARIDTMP,XNI,start=(/NXI/))
call CHECK_ERR(IRET)
IRET=NF90_GET_VAR(NCID,VARIDTMP,ALO(:,1))
call CHECK_ERR(IRET)
DO i=1,NYI
ALO(:,i)=ALO(:,1)
END DO
It breaks at the call CHECK_ERR(IRET) line.
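The failure mode can be sketched in a few lines of Python (a toy model with invented names, not WW3 code): the output above shows that ww3_prnc takes the first entry of the "Name of dimensions" line as the x (longitude) dimension, so listing time first sets NXI to 1464 while the longitude variable only has 1440 entries, and the NF90_GET_VAR call with start=(/NXI/) overruns it.

```python
# Toy reconstruction of the dimension-order bug (names hypothetical).
DIMS = {"time": 1464, "latitude": 721, "longitude": 1440}  # from the ncdump
LON = [0.25 * i for i in range(DIMS["longitude"])]         # stand-in lon values

def read_last_lon(dim_names_in_inp):
    """Mimic reading the last longitude value; NXI comes from the first
    dimension name listed in ww3_prnc.inp."""
    nxi = DIMS[dim_names_in_inp[0]]
    if nxi > len(LON):
        # what the NetCDF library reports in the log above
        raise IndexError("NetCDF: Index exceeds dimension bound")
    return LON[nxi - 1]

print(read_last_lon(["longitude", "latitude", "time"]))  # 359.75 -- works
# read_last_lon(["time", "latitude", "longitude"])  # raises IndexError
```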
Guys, is your NetCDF file in lat-lon gridded format or on an unstructured mesh? WW3 handles them differently, so you need to be careful.
@aliabdolali Hi Ali,
My NetCDF file is in lat-lon gridded format, directly downloaded from ERA5's global results, and for "$ Mayor types of field and time flag" I am using 'WND' 'LL' T T.
The header of the netcdf is as follows:
netcdf download_inv {
dimensions:
time = UNLIMITED ; // (1464 currently)
longitude = 1440 ;
latitude = 721 ;
variables:
int time(time) ;
time:standard_name = "time" ;
time:long_name = "time" ;
time:units = "hours since 1900-01-01 00:00:00.0" ;
time:calendar = "gregorian" ;
time:axis = "T" ;
float longitude(longitude) ;
longitude:standard_name = "longitude" ;
longitude:long_name = "longitude" ;
longitude:units = "degrees_east" ;
longitude:axis = "X" ;
float latitude(latitude) ;
latitude:standard_name = "latitude" ;
latitude:long_name = "latitude" ;
latitude:units = "degrees_north" ;
latitude:axis = "Y" ;
short u10(time, latitude, longitude) ;
u10:long_name = "10 metre U wind component" ;
u10:units = "m s**-1" ;
u10:add_offset = -0.692859959977449 ;
u10:scale_factor = 0.00110688284552248 ;
u10:_FillValue = -32767s ;
u10:missing_value = -32767s ;
Do you have suggestions on how to fix it? Thank you!
@yunfangsun Is it possible to put some print statements before line 869, like print *, 'NXI = ', NXI and print *, 'NYI = ', NYI, and recompile? Does WW3 get the proper dimension values? I checked the NetCDF file and it looks ok to me.
Could you share your ww3_prnc.nml (or .inp)?
@aliabdolali , Hi Ali, the ww3_prnc.inp is as follows:
$ -------------------------------------------------------------------- $
$ WAVEWATCH III Field preprocessor input file $
$ -------------------------------------------------------------------- $
$ Mayor types of field and time flag
$ Field types : ICE Ice concentrations.
$ LEV Water levels.
$ WND Winds.
$ WNS Winds (including air-sea temp. dif.)
$ CUR Currents.
$ DAT Data for assimilation.
$
$ Format types : AI Transfer field 'as is'. (ITYPE 1)
$ LL Field defined on regular longitude-latitude
$ or Cartesian grid. (ITYPE 2)
$ Format types : AT Transfer field 'as is', performs tidal
$ analysis on the time series (ITYPE 6)
$ When using AT, another line should be added
$ with the choice ot tidal constituents:
$ ALL or FAST or VFAST or a list: e.g. 'M2 S2'
$
$ - Format type not used for field type 'DAT'.
$
$ Time flag : If true, time is included in file.
$ Header flag : If true, header is added to file.
$ (necessary for reading, FALSE is used only for
$ incremental generation of a data file.)
$
'WND' 'LL' T T
$ 'LEV' 'AI' T T
$ 'CUR' 'AI' T T
$
$ Name of dimensions ------------------------------------------------- $
$
$ longitude latitude time
time latitude longitude
$
$ Variables to use --------------------------------------------------- $
$
u10 v10
$ uwnd vwnd
$zeta
$u-vel v-vel
$
$ Additional time input ---------------------------------------------- $
$ If time flag is .FALSE., give time of field in yyyymmdd hhmmss format.
$
$ 19680606 053000
$
$ Define data files -------------------------------------------------- $
$ The input line identifies the filename using for the forcing field.
download_inv.nc
$'wind_atm_fin_ch_time_vec.nc'
$
$
$ -------------------------------------------------------------------- $
$ End of input file $
$ -------------------------------------------------------------------- $
Change "time latitude longitude" to "longitude latitude time" and try again.
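For reference, with that fix applied the relevant sections of ww3_prnc.inp read (abridged from the file quoted above):

```
$ Name of dimensions ------------------------------------------------- $
$
 longitude latitude time
$
$ Variables to use --------------------------------------------------- $
$
 u10 v10
```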
Hi Ali @aliabdolali ,
It works after I changed it to "longitude latitude time".
Thank you!
Glad to hear it worked for you. WW3 is designed with care about these details, to make sure the user does not introduce errors by mistake.
Hi Ali @aliabdolali, Thank you! After I interpolated the wind forcing and ran ww3_multi, the job breaks after "Initializing wave model ...". The error message is:
Error with istat= 9
Before allocation of MDATAS % SEA_IPGL, SEA_IPGL_TO_PROC : IMOD= 1
NSEA= 2470094
Before allocation of MDATAS % SEA_IPGL, SEA_IPGL_TO_PROC : IMOD= 1
Before allocation of MDATAS % SEA_IPGL, SEA_IPGL_TO_PROC : IMOD= 1
NSEA= 2470094
NSEA= 2470094
EXTCDE MPI_ABORT, IEXIT= 99
EXTCDE MSG=ALLOCATE FAILED
EXTCDE FILE=w3adatmd.F90
EXTCDE LINE= 1337
EXTCDE MPI_ABORT, IEXIT= 99
EXTCDE MSG=ALLOCATE FAILED
EXTCDE FILE=w3adatmd.F90
EXTCDE LINE= 1337
EXTCDE MPI_ABORT, IEXIT= 99
EXTCDE MSG=ALLOCATE FAILED
EXTCDE FILE=w3adatmd.F90
EXTCDE LINE= 1337
EXTCDE MPI_ABORT, IEXIT= 99
EXTCDE MSG=ALLOCATE FAILED
EXTCDE FILE=w3adatmd.F90
EXTCDE LINE= 1337
EXTCDE MPI_ABORT, IEXIT= 99
EXTCDE MSG=ALLOCATE FAILED
EXTCDE FILE=w3adatmd.F90
EXTCDE LINE= 1337
Before allocation of MDATAS % SEA_IPGL, SEA_IPGL_TO_PROC : IMOD= 1
Before allocation of MDATAS % SEA_IPGL, SEA_IPGL_TO_PROC : IMOD= 1
Before allocation of MDATAS % SEA_IPGL, SEA_IPGL_TO_PROC : IMOD= 1
NSEA= 2470094
NSEA= 2470094
NSEA= 2470094
Could you please suggest where I should check to correct the case?
Thank you again!
Hi Ali @AliS-Noaa ,
I am now testing the WW3 stand-alone case. Could I know whether both ww3_shel.inp and ww3_shel.nml are essential for the unstructured case, or do I only need one of them?
Thank you!
Yunfang
@AliS-Noaa Ali hi, we have run ww3_multi (for parallel jobs) with unstructured mesh many times before (standalone cases or coupled) with no issues. Has anything changed in the recent versions of WW3? We run ww3_multi on large domains for parallel subgrid calculations. On small domains (when running as a serial job) we run ww3_shel (e.g., Shinnecock inlet simulations in UFS-Coastal). Is this a correct approach?
Yunfang,
The .inp alone is sufficient.
Cheers, Ali S
The multi is fine and you can run the unstructured mesh with multi. Please ignore that comment.
Cheers, Ali S
@AliS-Noaa Thank you Ali, we appreciate your help and time
Let me provide you guys a brief summary of the situation; then you can determine which way you need to go.
ww3_shel is the parallel program that supports both curvilinear grids and unstructured meshes. You can have one single grid or mesh in it.
ww3_multi is the program with nesting capabilities, so you can have one or more curvilinear grids. However, it supports only one single unstructured mesh.
The old ufs-coastal (the CoastalApp) we developed back in 2017-2020 worked with the extended wmesmf cap (originally developed by the Navy; we extended it to support unstructured meshes). This cap works only with ww3_multi.
The new mesh cap, developed by @deniseworthen, supports ww3_shel. The initial application was global cases like GFSv17. This modernization was then discussed with the NOS team, and @uturuncoglu took this task (I assume) to update ufs-coastal to be able to utilize the mesh cap. With this new feature, you can have a mediator and all the new features of WW3.
So, the question is which cap the current ufs-coastal is using: if it is wmesmf, you should use ww3_multi, or if it is the mesh cap, you need to use ww3_shel.
Note that multi and shel do the same thing.
Regarding the question about nml vs. inp: you should use one of them, and they are identical. The WW3 community is trying to migrate to nml, which is much more advanced, but as I said, they do the same thing.
If both files (ww3_shel.inp and ww3_shel.nml) are in the work directory, the model uses ww3_shel.inp.
Note that wmesmf only works with ww3_multi.inp, but I think the mesh cap supports both nml and inp.
Hope I have clarified the mystery for you guys; if not, let me know and I'll try to help if my time allows.
Have a great holiday, and let's play with waves in 2024!
@aliabdolali thanks for the detailed explanation. It is clearer to me now. At this point, I am trying to move to the mesh cap for UFS Coastal and to use the single (shel) rather than the multi namelist file, since we don't have multi-grid or nested domains at this point. I think this is also consistent with the other tests used under the UFS weather model. We have a working SCHISM+WW3 configuration forced by a data atmosphere with this approach, and we are trying to port the other WW3 configurations to the mesh cap.
You are on the right track
@aliabdolali Ali, thank you so much for the clarifications and your time. As usual, your help solves a lot of issues for us quickly and alleviates all our pains. I am working to "modernize" all the test cases in UFS-Coastal as well, so your explanations, as Ufuk mentioned, clarified a lot of things for us. Trying to work with 6 different model components at the same time makes things very difficult for us. Thank you so much for your help. I wish you all a happy holiday season and a happy, fruitful 2024.
I second Takis. Thanks @aliabdolali and @AliS-Noaa for your help and support.
Also, just to note that using the mesh cap also enables direct creation of NetCDF "gridded" output (e.g., 20210322.090000.out_grd.ww3.nc). This is enabled by adding an attribute to WAV:
WAV_model: ww3 .... gridded_netcdfout = true ::
Denise
-- Denise Worthen
Contractor with Lynker in support of NOAA/NWS/NCEP/EMC
Thank you very much for your great help! @aliabdolali @AliS-Noaa , I will use ww3_shel and ww3_shel.nml for my case.
Hi Ufuk, @uturuncoglu I have updated the input for the atm2sch2ww3 case in the folder /work2/noaa/nos-surge/yunfangs/stmp/yunfangs/FV3_RT/ufuk/coastal_ian_atm2ww3_intel_1. The case starts on Sep 15, 2022, and runs for 528 hours. Could you please help me check it and set up a run? Thank you. @pvelissariou1 the case location is as above.
@yunfangsun this might be the wrong folder; here I can only see the datm+ww3 configuration. Please double-check. BTW, are you able to run the entire simulation? Do you want me to run it for you? Also, I just wonder if you could check the results of the datm+ww3 case that I ran for you. Is there any issue with it?
@yunfangsun