NCAR / obs2ioda

Converter developed by @jamiebresch to transform conventional and remote sensing observations in different formats to the IODA format needed in JEDI

obs2ioda-v2.x processing limited to one 6h window #20

Open st-ncar opened 5 days ago

st-ncar commented 5 days ago

obs2ioda-v2.x seems to be able to read concatenated BUFR files that contain data from multiple 6h time windows. However, the data processing routines appear to assume that a BUFR file contains a single 6h time window, and obs2ioda-v2.x segfaults when a concatenated BUFR file is processed.

Example: consider the two BUFR files {gdas.satwnd.t00z.20180415.bufr, gdas.satwnd.t06z.20180415.bufr} from /glade/campaign/collections/rda/data/ds351.0/bufr/2018/. Processing each file in isolation works as expected (-split is optional):

$ ./obs2ioda-v2.x -split gdas.satwnd.t00z.20180415.bufr
$ ./obs2ioda-v2.x -split gdas.satwnd.t06z.20180415.bufr

One may expect that the BUFR files can be concatenated and processed together (-split is again optional; the segfault occurs regardless of this option):

$ cat gdas.satwnd.t00z.20180415.bufr gdas.satwnd.t06z.20180415.bufr > gdas.satwnd.combined.bufr
$ ./obs2ioda-v2.x -split gdas.satwnd.combined.bufr

In this case, obs2ioda-v2.x fails with the following runtime error:

--- reading ./gdas.satwnd.combined.bufr ---
 ./gdas.satwnd.combined.bufr file date is: 2018041500
obs_errtable file is found. Will use user-provided obs errors.
 num_report_infile ./gdas.satwnd.combined.bufr :    1082216
 --- applying some additional QC as in GSI read_satwnd.f90 for the global model
 ---
 min time: 20180414210000
 max time: 20180415030000
 --- sorting satwnd obs...
forrtl: severe (408): fort: (2): Subscript #2 of the array NLOCS has value 8 which is greater than the upper bound of 7

Image              PC                    Routine                Line         Source
obs2ioda-v2.x      000000000048AEF2      satwnd_mod_mp_sor      421          satwnd_mod.f90
obs2ioda-v2.x      00000000004BBC7A      MAIN__                 107          main.f90
obs2ioda-v2.x      0000000000402F2D      Unknown                Unknown      Unknown
libc-2.31.so       0000145DC3F5629D      __libc_start_main      Unknown      Unknown
obs2ioda-v2.x      0000000000402E5A      Unknown                Unknown      Unknown

The output suggests that obs2ioda-v2.x reads all data entries in the concatenated BUFR file but fails to process them properly. I think the issue is that filedate is only read from the first BUFR message (line 141 in satwnd_mod.f90) and that the observation window is hard-coded to be 6h long (e.g. line 74 in main.f90 and lines 29-30 in define_mod.f90). The time slots are then +/- 3h from the filedate of the first BUFR message. Entries from subsequent BUFR messages are associated with times later than the last time slot, which leads to out-of-bounds indices in line 405 of satwnd_mod.f90 and causes the segmentation fault (note: the line number in the error message above differs because I added some code for debugging).
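
To make the mechanism concrete, here is a minimal sketch of how an observation from the second file can end up with a slot index past the array bound. The variable names and the hourly slot mapping are my own illustration, not the actual code in satwnd_mod.f90:

program slot_index_sketch
   implicit none
   ! Hypothetical names and slot mapping, only to illustrate the failure mode.
   ! nslots mirrors the upper bound of 7 in the traceback (hourly slots over
   ! the assumed window filedate-3h .. filedate+3h).
   integer, parameter :: nslots = 7
   real    :: dt_hours   ! obs time minus start of the assumed window, in hours
   integer :: islot

   ! An observation from the second (t06z) file, about 7.5h after the window
   ! start implied by the first file's filedate of 2018041500.
   dt_hours = 7.5
   islot = int(dt_hours) + 1   ! = 8, one past the last allocated slot (7)
   print *, 'islot =', islot, ' exceeds nslots =', nslots
end program slot_index_sketch

In this toy mapping, any observation 7h or more past the window start yields islot = 8, matching the "value 8 ... upper bound of 7" in the traceback above.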

The current code behavior is only an issue if we want to process concatenated BUFR files. In the above example, concatenation may be desirable in order to write the observations from 02:30-03:30 (parts of which are stored in each BUFR file) into a single HDF5 file. In this sense, the issue is connected to #19.

Perhaps it would be helpful to either check that all messages in a BUFR file fall within a single 6h window, or generalize the code to handle messages from multiple windows (I am not sure the latter is needed, especially if the file I/O described in #19 is improved).
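
As a stopgap (a sketch only, with placeholder names, not a patch against the actual routines), a bounds guard on the slot index would at least report and skip out-of-window observations instead of crashing:

program slot_guard_sketch
   implicit none
   ! Placeholder names; not the identifiers used in satwnd_mod.f90.
   integer, parameter :: nslots = 7               ! slots of the assumed 6h window
   integer :: test_slots(3) = (/ 3, 7, 8 /)       ! hypothetical slot indices
   integer :: islot, i

   do i = 1, size(test_slots)
      islot = test_slots(i)
      if (islot < 1 .or. islot > nslots) then
         print *, 'obs outside the assumed 6h window (slot', islot, '), skipping'
         cycle
      end if
      print *, 'processing slot', islot
   end do
end program slot_guard_sketch

This does not remove the single-window assumption, but it would make the failure mode obvious to the user.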

Adding @liujake @junmeiban @ibanos90. Please let me know if I am missing something or if you disagree with my interpretation. Thank you!

ibanos90 commented 4 days ago

Hi @st-ncar, I think this could be related to the fact that the length of the time window is hard coded (6): https://github.com/NCAR/obs2ioda/blob/2d8033b882751638ace52f723efc4d5a96ece9b6/obs2ioda-v2/src/main.f90#L74 I wonder if you use a larger time window, it may work for concatenated files containing more than 6h window (-3 to +3). Not sure if this will solve the issue, but may be worth trying.