AndrewHazelton opened this issue 4 months ago
I don't have read permissions for your files here: /work2/noaa/aoml-hafs1/ahazelto/hafs-input/GFS_FNL/gfs.20080910/00/atmos/
I have updated the permissions - it should be readable now.
I see what is happening. Doing a dump of fnl_20080910_00_00.grib2, I see what looks like soil temperature in records 235-238:
235:9633444:d=2008091000:TMP Temperature [K]:0-0.1 m below ground:anl:
236:9658101:d=2008091000:TMP Temperature [K]:0.1-0.4 m below ground:anl:
237:9682412:d=2008091000:TMP Temperature [K]:0.4-1 m below ground:anl:
238:9706791:d=2008091000:TMP Temperature [K]:1-2 m below ground:anl:
If I check these records for the GRIB2 discipline, I see '0'- meteorological products:
235:9633444:code table 0.0=0 Meteorological Products
236:9658101:code table 0.0=0 Meteorological Products
237:9682412:code table 0.0=0 Meteorological Products
238:9706791:code table 0.0=0 Meteorological Products
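For reference, the discipline sits at a fixed position in every GRIB2 message: octet 7 of the 16-byte indicator section (Section 0), i.e. byte offset 6. So it can also be checked without wgrib2. A minimal Python sketch, using synthetic Section 0 bytes rather than real FNL data:

```python
import struct

def grib2_discipline(msg: bytes) -> int:
    """Return the discipline code from a GRIB2 message's 16-byte
    indicator section: octets 1-4 are "GRIB", octet 7 is the
    discipline, octet 8 is the edition number."""
    if msg[:4] != b"GRIB":
        raise ValueError("not a GRIB message")
    edition = msg[7]
    if edition != 2:
        raise ValueError(f"expected GRIB edition 2, got {edition}")
    return msg[6]

# Synthetic Section 0 only (a real message continues past these 16 bytes).
# Discipline 0 = meteorological products, 2 = land surface products.
met = b"GRIB" + b"\x00\x00" + bytes([0, 2]) + struct.pack(">Q", 16)
land = b"GRIB" + b"\x00\x00" + bytes([2, 2]) + struct.pack(">Q", 16)
print(grib2_discipline(met))   # 0
print(grib2_discipline(land))  # 2
```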
chgres_cube assumes soil temperature is discipline '2' - land surface products:
https://www.nco.ncep.noaa.gov/pmb/docs/grib2/grib2_doc/grib2_table4-1.shtml#2
So, to get chgres_cube past this step, some code mods will be required. There may be other issues that will require additional work.
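The gist of such a modification, sketched here in Python rather than in chgres_cube's actual Fortran, would be to recognize soil temperature by its fixed-surface type (106, depth below land surface, GRIB2 Code Table 4.5) in addition to its discipline. The function and key names below are hypothetical, not chgres_cube identifiers:

```python
# Hypothetical sketch of relaxed matching logic; not actual chgres_cube code.
# GRIB2 keys: (discipline, parameter category, parameter number), plus the
# type of first fixed surface (106 = depth below land surface).
SOIL_TEMP_KEYS = {
    (2, 0, 2),  # land surface products: TSOIL (how recent GFS files encode it)
    (0, 0, 0),  # meteorological products: TMP (how the 2008 FNL encodes it)
}

def is_soil_temperature(discipline, category, number, level_type):
    """Accept soil temperature whether it is coded as a land-surface
    product or as plain temperature on below-ground levels."""
    return (discipline, category, number) in SOIL_TEMP_KEYS and level_type == 106

print(is_soil_temperature(0, 0, 0, 106))  # True  (old FNL encoding)
print(is_soil_temperature(2, 0, 2, 106))  # True  (current GFS encoding)
print(is_soil_temperature(0, 0, 0, 103))  # False (height above ground, not soil)
```

Requiring the below-ground level type keeps ordinary atmospheric TMP records from being mistaken for soil temperature.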
I see - is there a way to get chgres_cube to use those even though they are meteorological data?
Not without some code modifications. Are you using the head of develop?
Yeah it's the support/HAFS branch, which I believe was branched off of develop: commit e6eec36f5b51dee2f3395b13e31a695cf83c8a3f (HEAD, origin/support/HAFS, origin/HEAD, support/HAFS)
The code mods to get past the error should be minimal. But there could be other issues with this old data. I can try to run the case myself. Don't get rid of the input data or the fort.41 namelist. Make sure I have read permissions on the data.
OK, I will leave it untouched for now.
Andy
I can't get to your working directory. Can you leave the fort.41 file somewhere for me and make sure I can read all the files it points to? Thanks.
I changed the working directory to be readable - it's here: /work2/noaa/aoml-hafs1/ahazelto/hafstmp2024/HAFS_tutorial_test0_oldgfs/2008091000/09L/atm_ic/.
Let me know if you still can't access it.
Got stopped at this point: drwxr-s--- 12 ahazelto aoml-hafs1 4096 May 29 15:55 hafstmp2024
OK looks better now?
drwxr-sr-- 12 ahazelto aoml-hafs1 4096 May 29 15:55 hafstmp2024/
Try chmod 755.
I did a chmod -R 755 on /work2/noaa/aoml-hafs1/ahazelto/hafstmp2024/HAFS_tutorial_test0_oldgfs/. Does that work?
Yes. I am able to reproduce the error. Don't remove any of your files.
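(As an aside: `chmod -R 755` also marks every regular file executable. A gentler way to open a tree for reading is the symbolic `X` mode, which grants execute only on directories and on files that are already executable. A sketch using a throwaway directory:)

```shell
# Make a tree world-readable without making data files executable.
mkdir -p /tmp/permdemo/sub
echo data > /tmp/permdemo/sub/file.txt
# u=rwX,go=rX: owner read/write, everyone read; x only where needed
# for directory traversal.
chmod -R u=rwX,go=rX /tmp/permdemo
ls -ld /tmp/permdemo /tmp/permdemo/sub/file.txt
```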
My test data and script is here: /work2/noaa/da/ggayno/save/chgres.old.grib2
Hi George, just wanted to follow up on this and see if you had any further insights or suggestions? Thanks!
I documented what is happening here: https://github.com/ufs-community/UFS_UTILS/issues/954#issuecomment-2139931691
Code changes will be required to work with this data. I can assist if you want to try updating the code yourself.
Hello,
We are hoping to run HAFS on some older hurricanes (for example, Hurricane Ike in 2008). We were attempting to initialize off of the NCEP FNL analyses (https://rda.ucar.edu/datasets/ds083.2/). Many of the variables in those files appear to be the same as in the current GFS analyses, and chgres_cube did seem to create some atmospheric ICs successfully.
The log file for this test is on Orion: /work2/noaa/aoml-hafs1/ahazelto/logs/hafs_atm_ic.log
The files used for ICs/BCs are here: /work2/noaa/aoml-hafs1/ahazelto/hafs-input/GFS_FNL/gfs.20080910/00/atmos/
Is there a way to modify chgres to use these files, or provide a proxy for some of the needed soil variables?
Thanks,
Andy Hazelton