ucd-library / glm-lightning

Processing the AWS cloud lightning product
MIT License

Cannot postprocess downloaded files #4

Open prakashvs613 opened 2 years ago

prakashvs613 commented 2 years ago

@qjhart

While I am able to run the Makefile and download GLM data from the AWS servers, I am unable to run the NCL processing step on the downloaded .nc files.

qjhart commented 2 years ago

@prakashvs613 let's try to pin down exactly where the problem is. Let's pick one day as an example, say 2019, Julian day 170.

If we break this into pieces, the first step is downloading the data: make yyyy=2019 j=170 files. That target runs:

[[ -d GLM-L2-LCFA/2019/170 ]] || mkdir -p GLM-L2-LCFA/2019/170;\
aws s3 cp s3://noaa-goes17/GLM-L2-LCFA/2019/170 GLM-L2-LCFA/2019/170 --recursive --no-sign-request

After this we should have about 4320 files downloaded, all of them .nc files.

find GLM-L2-LCFA/2019/170 -type f | wc -l
# 4320
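Those 4320 files correspond to 24 hour directories with 180 twenty-second granules each, so a per-hour count is a quick way to catch a partial download. A small sketch (the helper name check_day is mine, not part of the repo):

```shell
# Count .nc files in each hour subdirectory of a day directory.
# A complete day should show roughly 180 files in each of 00..23.
check_day() {
  day_dir=$1              # e.g. GLM-L2-LCFA/2019/170
  for h in "$day_dir"/*/; do
    printf '%s %s\n' "$h" "$(find "$h" -type f -name '*.nc' | wc -l | tr -d ' ')"
  done
}
```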

Standard NetCDF utilities should work, e.g.:

ncdump -h GLM-L2-LCFA/2019/170/01/OR_GLM-L2-LCFA_G17_s20191700153400_e20191700154000_c20191700154027.nc

results in:

netcdf OR_GLM-L2-LCFA_G17_s20191700153400_e20191700154000_c20191700154027 {
dimensions:
    number_of_flashes = UNLIMITED ; // (274 currently)
    number_of_groups = UNLIMITED ; // (5405 currently)
    number_of_events = UNLIMITED ; // (12467 currently)
    number_of_time_bounds = 2 ;
    number_of_field_of_view_bounds = 2 ;
    number_of_wavelength_bounds = 2 ;
variables:
    int event_id(number_of_events) ;
        event_id:long_name = "product-unique lightning event identifier" ;
        event_id:_Unsigned = "true" ;
        event_id:units = "1" ;
.....
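Since the header lists the per-file flash/group/event counts, you can also pull those numbers out of the ncdump output as a quick sanity check. A sketch (the helper name flash_counts is mine, not part of the repo):

```shell
# Extract the per-granule counts from an ncdump -h header on stdin.
# Emits lines like "flashes 274" for the three UNLIMITED dimensions.
flash_counts() {
  grep -E 'number_of_(flashes|groups|events) = UNLIMITED' |
    sed -E 's|.*number_of_([a-z]+) = UNLIMITED ; // \(([0-9]+) currently\)|\1 \2|'
}
# Usage: ncdump -h some_granule.nc | flash_counts
```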

We combine all these data into daily CSV summaries. To do this, we run an NCL script. That language is not super easy to use (or, more likely, I'm not super proficient at coding it). So we take the DDD.ncl template and rewrite it for each of those ~4K files (for one day). One templated file in our case is shown below.

In the example below, I've replaced the write_table commands with print_table for illustration.

;----------------------------------------------------------------------
; flashes_DDD.ncl

begin
;---NetCDF file to read in.
  filename = "GLM-L2-LCFA/2019/170/15/OR_GLM-L2-LCFA_G17_s20191701544000_e20191701544200_c20191701544217.nc"
  fin      = addfile(filename,"r")

;-- Add to the dataset table

product_time=fin->product_time
product_time_bounds=fin->product_time_bounds

;print(product_time_bounds)

print_table([/fin@id,fin@platform_ID,fin@dataset_name,fin@date_created,\
  fin@time_coverage_start,fin@time_coverage_end\
;  product_time,product_time_bounds(0),product_time_bounds(1) \
  /],\
  "%s,%s,%s,%s,%s,%s")

 flash_id=fin->flash_id
 time_offset_of_first_event=fin->flash_time_offset_of_first_event
 time_offset_of_last_event=fin->flash_time_offset_of_last_event
; frame_time_offset_of_first_event=fin->flash_frame_time_offset_of_first_event
; frame_time_offset_of_last_event=fin->flash_frame_time_offset_of_last_event
 lat=fin->flash_lat
 lon=fin->flash_lon
 area=fin->flash_area
 energy=fin->flash_energy
 quality_flag=fin->flash_quality_flag

dataset_id=conform(flash_id,fin@id,0)

flash_filename = "csv/2019/170/flashes.csv"

;---Write data to file
  alist  = [/ flash_id \
  ,dataset_id \
  ,time_offset_of_first_event \
  ,time_offset_of_last_event \
  ,lat \
  ,lon \
  ,area \
  ,energy \
  ,quality_flag \
/]
  format = "%hu,%s,%hu,%hu,%g,%g,%hu,%hu,%hhu"
  print_table(alist, format)
end
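The per-file rewrite of the template can be sketched like this. The repo's Makefile has its own mechanics; the @FILE@ placeholder and the flashes_template.ncl name here are hypothetical, just to show the idea:

```shell
# Sketch only: generate and run one NCL script per granule (hypothetical
# template name and @FILE@ placeholder; the actual Makefile rule differs).
find GLM-L2-LCFA/2019/170 -type f -name '*.nc' 2>/dev/null |
while read -r nc; do
  sed "s|@FILE@|$nc|" flashes_template.ncl > DDD.ncl
  ncl -Q DDD.ncl >> csv/2019/170/flashes.csv
done
```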

This is accomplished in the Makefile with make yyyy=2019 j=170 flashes. The ~4K files for the day all get appended to the dataset.csv and flashes.csv tables, but using the script above (with print_table) this is what processing one file looks like:

ncl -Q DDD.ncl
989fe828-6bac-4a25-b470-2a816bc70490,G17,OR_GLM-L2-LCFA_G17_s20191701544000_e20191701544200_c20191701544217.nc,2019-06-19T15:44:21.7Z,2019-06-19T15:44:00.0Z,2019-06-19T15:44:20.0Z
4297,989fe828-6bac-4a25-b470-2a816bc70490,12886,12891,6.33958,-90.0676,606,8,0
4295,989fe828-6bac-4a25-b470-2a816bc70490,11676,13193,-18.9645,-164.393,3007,99,0
4298,989fe828-6bac-4a25-b470-2a816bc70490,13074,14069,6.39404,-90.6173,2387,348,0
4299,989fe828-6bac-4a25-b470-2a816bc70490,13203,14204,10.5352,-83.9197,3563,389,0
4296,989fe828-6bac-4a25-b470-2a816bc70490,12786,14396,-29.7502,-154.952,2868,352,0
4301,989fe828-6bac-4a25-b470-2a816bc70490,14632,15439,6.27679,-90.0894,1816,167,0
4304,989fe828-6bac-4a25-b470-2a816bc70490,15866,16142,-28.0518,-156.522,472,5,0
.... many more
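Once flashes.csv exists, a quick structural check can catch truncated rows: each flashes row should have nine comma-separated fields, with a numeric flash_id first. The helper name check_flashes is mine, not part of the repo:

```shell
# Validate flashes rows on stdin: 9 fields, numeric flash_id in column 1.
check_flashes() {
  awk -F, 'NF != 9 || $1 !~ /^[0-9]+$/ { bad++ }
           END { printf "%d rows, %d malformed\n", NR, bad + 0 }'
}
# Usage: check_flashes < csv/2019/170/flashes.csv
```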

That's how you can test your NCL processing and generate the CSV files.