NCAR / hrldas

HRLDAS (High Resolution Land Data Assimilation System)

Regarding Preprocessing using create_uv.perl file #25

Closed vinni94 closed 1 year ago

vinni94 commented 2 years ago

Hi, I am raising this issue because, while compiling the Fortran code invoked by the Perl script create_UV.perl, the build uses a linker path like -L/home/user/Documents/NCEPLIBS-bacio-develop/NCEPLIBS-w3emc-develop/w3emc/lib -lw, and this throws the error 'undefined -lw'. I am not able to figure out which library it is trying to use. After some research I found that NCEP libraries called bacio and w3emc are used along with Fortran programs, and I have installed them. However, even when I link against the lib directories of these installations, the error remains the same. Please help.
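(Note for future readers: the `-L` flag tells the linker where to search, and each `-l<name>` flag expects a file `lib<name>.a` or `lib<name>.so` in one of those directories, so a truncated flag like `-lw` fails unless a `libw.a` actually exists. A minimal sketch of that lookup rule, assuming the NCEPLIBS naming convention `libw3emc_4.a` / `libbacio_4.a`; the path below is hypothetical:)

```python
from pathlib import Path

def find_lib(libdir: str, name: str):
    """Return the path of lib<name>.a or lib<name>.so inside libdir, or None.

    This mirrors what the linker does for a -L<libdir> -l<name> pair.
    """
    for suffix in (".a", ".so"):
        candidate = Path(libdir) / f"lib{name}{suffix}"
        if candidate.exists():
            return candidate
    return None

# Hypothetical path: the NCEPLIBS static libraries are conventionally named
# libw3emc_4.a and libbacio_4.a, so a bare '-lw' flag cannot match them.
print(find_lib("/home/user/Documents/NCEPLIBS-w3emc-develop/lib", "w3emc_4"))
```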

CharlesZheZhang commented 2 years ago

Hi Vinni,

It looks like you are using the GLDAS forcing. Please note that the GLDAS forcing data switched from grib to netcdf format two years ago. For this change, now there is a separate script, create_UV_netcdf.perl, to deal with the wind component. Can you try this Perl script and see if it works?

The details about pre-processing GLDAS in netcdf format are documented in this file: https://github.com/NCAR/hrldas/blob/master/hrldas/docs/README.GLDAS
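(A quick way to confirm which format a given forcing file actually is, before choosing between create_UV.perl and create_UV_netcdf.perl: check its leading magic bytes. This is a minimal sketch based only on the well-known signatures of the GRIB, netCDF classic, and netCDF-4/HDF5 formats:)

```python
def detect_format(first_bytes: bytes) -> str:
    """Classify a file by its leading magic bytes."""
    if first_bytes.startswith(b"GRIB"):
        return "grib"
    if first_bytes.startswith(b"CDF"):                # netCDF classic / 64-bit offset
        return "netcdf-classic"
    if first_bytes.startswith(b"\x89HDF\r\n\x1a\n"):  # netCDF-4 files are HDF5-based
        return "netcdf-4"
    return "unknown"

# Usage: with open(path, "rb") as f: print(detect_format(f.read(8)))
```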

vinni94 commented 2 years ago

Dear Zhezhang, all the preprocessing scripts, including create_UV.perl and combine_precipition.perl, have run successfully. The only issue is when I finally run create_forcing_netcdf.exe: it throws the error message "unrecognised variable Tair_f_inst".

Also, I had to make a minor edit to the Makefile in order to compile create_forcing_netcdf.F instead of create_forcing.F. Thanks very much for your help and quick response.


vinni94 commented 2 years ago

Here are the last few lines of error:

```
flnm = geo_em.d01.nc
 Done with subroutine read_geo_em_file
 Date = 2018-01-03_00  ihour =            0
             :  Checking for file '/media/user/Elements/GLDAS_data/2018/noahmp_extracted/Tair/GLDAS_Tair_f_inst_2018010300'
             :  Found file /media/user/Elements/GLDAS_data/2018/noahmp_extracted/Tair/GLDAS_Tair_f_inst_2018010300
 Succesfully read the file:
 /media/user/Elements/GLDAS_data/2018/noahmp_extracted/Tair/GLDAS_Tair_f_inst_2018010300

 Can't read variable:
 Tair_f_inst

 Returning error flag from get_single_datastruct_from_netcdf (1)
 flnm =
 /media/user/Elements/GLDAS_data/2018/noahmp_extracted/Tair/GLDAS_Tair_f_inst_2018010300
 ierr =          -57
Field label= T2D
No previous data.
```

CharlesZheZhang commented 2 years ago

Hi Vinni, this error message means the code hit an error when reading that variable from the netcdf file, so it returned ierr /= 0. Can you check that all variables were extracted properly from GLDAS in the previous steps? You can also attach the files and the namelist to this issue, and I will see if I can help.
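(One way to check the extracted files is to run the standard netCDF tool `ncdump -h` on each one and confirm the expected variable appears in its `variables:` section. Below is a rough, illustrative parser for that header text; the regex only matches declarations like `float Tair_f_inst(time, lat, lon) ;` and is a sketch, not part of HRLDAS:)

```python
import re

def header_variables(ncdump_header: str) -> set:
    """Extract variable names from the 'variables:' section of `ncdump -h` output."""
    names = set()
    in_vars = False
    for line in ncdump_header.splitlines():
        s = line.strip()
        if s.startswith("variables:"):
            in_vars = True
            continue
        if in_vars:
            # Matches '<type> <name>(<dims>)', e.g. 'float Tair_f_inst(time, lat, lon) ;'
            m = re.match(r"\w+ (\w+)\(", s)
            if m:
                names.add(m.group(1))
    return names
```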

Regards, Zhe

vinni94 commented 2 years ago

Dear Zhezhang, Thanks for the help. I am hereby enclosing the following docs:

  1. GLDAS sample files after using all the new perl scripts for netcdf processing.
  2. Namelist file along with the geo_em.d01 file for my domain. Also, I have attached a small test Fortran file (named test_nc.F, compiled and linked against netcdf 4.8.x) to check whether the variable is being read properly. It reads properly there, but not in create_forcing_netcdf.F.

GLDAS_sample_data.zip namelist_geo_em.zip

vinni94 commented 2 years ago

PS: Does the global GLDAS data need to be downloaded for forcing? I have used GLDAS data subsetted to my region only.

CharlesZheZhang commented 2 years ago

Hi Vinni,

That is indeed the problem - the read_netcdf_unit subroutine in the create_forcing_netcdf.F code requires exactly the global dimensions (600x1440). Please try with the global data and see if it works.
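(For reference, 600x1440 is the 0.25-degree global GLDAS grid: 150 degrees of latitude from 60S to 90N gives 600 rows, and 360 degrees of longitude gives 1440 columns. A trivial sketch of the shape check a user could run on their own file before preprocessing; the function name is illustrative, not from HRLDAS:)

```python
def check_gldas_grid(nlat: int, nlon: int) -> bool:
    """Return True if (nlat, nlon) matches the 0.25-degree global GLDAS grid.

    The global grid covers 60S-90N (600 rows) by 0-360E (1440 columns),
    which is the fixed shape create_forcing_netcdf.F expects per this thread.
    """
    return (nlat, nlon) == (600, 1440)

print(check_gldas_grid(600, 1440))   # global file -> True
print(check_gldas_grid(120, 160))    # regional subset -> False
```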

Zhe

vinni94 commented 2 years ago

That's great. Thank you Zhe. 😄 I will try doing the same with global data.

vinni94 commented 2 years ago

Dear CharlesZheZhang, thanks for your help so far. I was able to run HRLDAS successfully after I downloaded the global GLDAS files. I have a small query regarding the HRLDAS namelist: I can see there is a variable named SPINUP_LOOPS which, I guess, is used for model spinup, but I am not sure whether it is specified in hours or days.

  1. Let's say I want to coldstart/spin up my model for 2 years repeated 20 times. What value of SPINUP_LOOPS should I give? After spinup, should I use the last restart file generated for my analysis runs?
  2. When I tested a spinup for, let's say, 16 days, the model ran each day 30 times. How can I achieve what I described in question 1?
cenlinhe commented 1 year ago

Hi Vinni, to answer your questions:

  1. set SPINUP_LOOPS=20 and KDAYS=730 (i.e., 365*2 if no leap year). After spinup, you need to use the latest restart file generated (you can check the time of each restart file generated and use the most recent one generated).
  2. The spinup capability is to repeat the run for the "KDAYS" you specified. So if you want to run each day 30 times, then you need to set SPINUP_LOOPS=30 and KDAYS=1 for spinup.
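(Both answers follow one rule: the model repeats the KDAYS-long period SPINUP_LOOPS times, so the total simulated length is simply their product. A small sketch of that arithmetic for the two cases above:)

```python
def total_spinup_days(spinup_loops: int, kdays: int) -> int:
    """Total simulated days: the KDAYS-long period is repeated SPINUP_LOOPS times."""
    return spinup_loops * kdays

# Case 1: two years (730 days, no leap year) repeated 20 times
print(total_spinup_days(20, 730))   # 14600 days
# Case 2: each single day repeated 30 times
print(total_spinup_days(30, 1))     # 30 days
```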