foua-pps / level1c4pps

Module to create level1c input files for PPS using satpy
GNU General Public License v3.0

For AVHRR data, missing ch3a/b gives no image5 dataset #68

Closed TAlonglong closed 2 years ago

TAlonglong commented 2 years ago

I'm not sure if this is a feature, an enhancement, or a bug.

AVHRR l1b only gives one of the ch3a/b channels, and this sometimes gives netcdf files with a missing image5 dataset.

This is fine for most purposes, as far as I understand.

But for HRW processing, the individual AVHRR segments need to be concatenated into one netcdf file.

When some of the AVHRR netcdf segments are missing the image5 dataset and others are not, the HRW prepare step fails to concatenate them because the number of image datasets differs.

So again, maybe this needs to be fixed in the HRW preprocessing.
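
As an illustration, here is a minimal sketch of how such a mismatch could be detected before concatenation. This is not the actual HRWPrepare code; the glob pattern and the use of the netCDF4 package are my own assumptions.

import glob

from netCDF4 import Dataset

# Hypothetical pattern matching the per-segment level1c files.
segment_files = sorted(glob.glob("S_NWC_avhrr_metopb_*.nc"))

# Collect the imageN datasets present in each segment.
image_sets = {}
for path in segment_files:
    with Dataset(path) as nc:
        image_sets[path] = {name for name in nc.variables if name.startswith("image")}

# Report segments that lack image datasets present in other segments,
# e.g. image5 when ch3b (or ch3a) was not available.
all_images = set().union(*image_sets.values())
for path, images in image_sets.items():
    missing = sorted(all_images - images)
    if missing:
        print(f"{path} is missing: {missing}")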

When I generate AVHRR netcdf files I do it like this:

from level1c4pps.avhrr2pps_lib import process_one_scene as process_avhrr
process_avhrr(<l1b aapp file>, <out_directory>)
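
Just as a sketch, the same call form can be looped over several l1b segments; the two file names below are the ones from the logs further down, and the output directory is a placeholder.

from level1c4pps.avhrr2pps_lib import process_one_scene as process_avhrr

out_directory = "/path/to/out_directory"  # placeholder
for l1b_file in ["hrpt_metop01_20220204_0740_48682.l1b",
                 "hrpt_metop01_20220204_0757_48682.l1b"]:
    # Same call form as in the snippet above; check the expected
    # argument types against your level1c4pps version.
    process_avhrr(l1b_file, out_directory)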

For a dataset with both ch3a and ch3b I get:

hrpt_metop01_20220204_0740_48682.l1b 
level1c4pps INFO: |08:06:58|: No valid operational coefficients, fall back to pre-launch
level1c4pps INFO: |08:06:58|: No valid operational coefficients, fall back to pre-launch
level1c4pps INFO: |08:06:58|: No valid operational coefficients, fall back to pre-launch
level1c4pps INFO: |08:06:59|: Saving datasets to NetCDF4/CF.
/software/miniconda/envs/ppsv2021/lib/python3.7/site-packages/satpy/writers/cf_writer.py:572: FutureWarning: The default behaviour of the CF writer will soon change to not compress data by default.
  FutureWarning)
/software/miniconda/envs/ppsv2021/lib/python3.7/site-packages/dask/core.py:119: RuntimeWarning: divide by zero encountered in true_divide
  return func(*(_execute_task(a, cache) for a in args))
Saved file S_NWC_avhrr_metopb_00000_20220204T0740000Z_20220204T0740599Z.nc after 3.5 seconds

Giving this netcdf:

h5dump -H S_NWC_avhrr_metopb_48682_20220204T0740000Z_20220204T0740599Z.nc | grep DATASET
   DATASET "azimuthdiff" {
   DATASET "bnds_1d" {
   DATASET "image0" {
   DATASET "image1" {
   DATASET "image2" {
   DATASET "image3" {
   DATASET "image4" {
   DATASET "image5" {
   DATASET "lat" {
   DATASET "lon" {
   DATASET "satzenith" {
   DATASET "sunzenith" {
   DATASET "time" {
   DATASET "time_bnds" {
   DATASET "x" {
   DATASET "y" {

For a dataset missing one of ch3a/b I get:

hrpt_metop01_20220204_0757_48682.l1b
level1c4pps INFO: |08:05:42|: No valid operational coefficients, fall back to pre-launch
level1c4pps INFO: |08:05:42|: No valid operational coefficients, fall back to pre-launch
level1c4pps INFO: |08:05:42|: No valid operational coefficients, fall back to pre-launch
level1c4pps ERROR: |08:05:42|: Could not load dataset 'DataID(name='3b', wavelength=WavelengthRange(min=3.55, central=3.74, max=3.93, unit='µm'), resolution=1050, calibration=<calibration.brightness_temperature>, modifiers=())': "Could not load DataID(name='3b', wavelength=WavelengthRange(min=3.55, central=3.74, max=3.93, unit='µm'), resolution=1050, calibration=<calibration.brightness_temperature>, modifiers=()) from any provided files"
Traceback (most recent call last):
  File "/software/miniconda/envs/ppsv2021/lib/python3.7/site-packages/satpy/readers/yaml_reader.py", line 848, in _load_dataset_with_area
    ds = self._load_dataset_data(file_handlers, dsid, **kwargs)
  File "/software/miniconda/envs/ppsv2021/lib/python3.7/site-packages/satpy/readers/yaml_reader.py", line 720, in _load_dataset_data
    proj = self._load_dataset(dsid, ds_info, file_handlers, **kwargs)
  File "/software/miniconda/envs/ppsv2021/lib/python3.7/site-packages/satpy/readers/yaml_reader.py", line 706, in _load_dataset
    "Could not load {} from any provided files".format(dsid))
KeyError: "Could not load DataID(name='3b', wavelength=WavelengthRange(min=3.55, central=3.74, max=3.93, unit='µm'), resolution=1050, calibration=<calibration.brightness_temperature>, modifiers=()) from any provided files"
level1c4pps WARNING: |08:05:43|: The following datasets were not created and may require resampling to be generated: DataID(name='3b', wavelength=WavelengthRange(min=3.55, central=3.74, max=3.93, unit='µm'), resolution=1050, calibration=<calibration.brightness_temperature>, modifiers=())
level1c4pps INFO: |08:05:44|: Saving datasets to NetCDF4/CF.
/software/miniconda/envs/ppsv2021/lib/python3.7/site-packages/satpy/writers/cf_writer.py:572: FutureWarning: The default behaviour of the CF writer will soon change to not compress data by default.
  FutureWarning)
Saved file S_NWC_avhrr_metopb_00000_20220204T0757000Z_20220204T0757599Z.nc after 3.5 seconds

Giving this netcdf:

h5dump -H S_NWC_avhrr_metopb_48682_20220204T0757000Z_20220204T0757599Z.nc | grep DATASET
   DATASET "azimuthdiff" {
   DATASET "bnds_1d" {
   DATASET "image0" {
   DATASET "image1" {
   DATASET "image2" {
   DATASET "image3" {
   DATASET "image4" {
   DATASET "lat" {
   DATASET "lon" {
   DATASET "satzenith" {
   DATASET "sunzenith" {
   DATASET "time" {
   DATASET "time_bnds" {
   DATASET "x" {
   DATASET "y" {

Can you please comment on this, so I can decide how to solve it?

ninahakansson commented 2 years ago

Thanks for raising this issue @TAlonglong! I see how this is a problem for HRW. I think it should be solved in HRWPrepare. We will discuss it in the PPS team.