insarlab / MintPy

Miami InSAR time-series software in Python
https://mintpy.readthedocs.io

Encountered the following error when importing GAMMA data: ValueError: required dataset "height" is missing in file geometryGeo.h5 #1271

Closed songzwgithub closed 1 week ago

songzwgithub commented 2 months ago

Hello everyone,

It seems that I cannot generate the geometry H5 file correctly when importing GAMMA data.

Full error message:


___________________________________________________________

  /##      /## /##             /##     /#######
 | ###    /###|__/            | ##    | ##__  ##
 | ####  /#### /## /#######  /######  | ##  \ ## /##   /##
 | ## ##/## ##| ##| ##__  ##|_  ##_/  | #######/| ##  | ##
 | ##  ###| ##| ##| ##  \ ##  | ##    | ##____/ | ##  | ##
 | ##\  # | ##| ##| ##  | ##  | ## /##| ##      | ##  | ##
 | ## \/  | ##| ##| ##  | ##  |  ####/| ##      |  #######
 |__/     |__/|__/|__/  |__/   \___/  |__/       \____  ##
                                                 /##  | ##
                                                |  ######/
   Miami InSAR Time-series software in Python    \______/
          MintPy 1.6.1, 2024-07-31
___________________________________________________________

--RUN-at-2024-09-27 23:37:32.966171--
Current directory: /media/s/新加卷/DATA_process/cangzhou/asc/P142F116/SLC/mintpy
Run routine processing with smallbaselineApp.py on steps: ['load_data', 'modify_network', 'reference_point', 'quick_overview', 'correct_unwrap_error', 'invert_network', 'correct_LOD', 'correct_SET', 'correct_ionosphere', 'correct_troposphere', 'deramp', 'correct_topography', 'residual_RMS', 'reference_date', 'velocity', 'geocode', 'google_earth', 'hdfeos5']
Remaining steps: ['modify_network', 'reference_point', 'quick_overview', 'correct_unwrap_error', 'invert_network', 'correct_LOD', 'correct_SET', 'correct_ionosphere', 'correct_troposphere', 'deramp', 'correct_topography', 'residual_RMS', 'reference_date', 'velocity', 'geocode', 'google_earth', 'hdfeos5']
--------------------------------------------------
Go to work directory: /media/s/新加卷/DATA_process/cangzhou/asc/P142F116/SLC/mintpy
read default template file: /media/s/新加卷/DATA_process/cangzhou/asc/P142F116/SLC/mintpy/smallbaselineApp.cfg

******************** step - load_data ********************

load_data.py --template /media/s/新加卷/DATA_process/cangzhou/asc/P142F116/SLC/mintpy/smallbaselineApp.cfg
processor : gamma
SAR platform/sensor : unknown from project name "None"
--------------------------------------------------
prepare metadata files for gamma products
prep_gamma.py "interferograms/*/diff_*rlks.unw" --dem "geometry/sim_*rlks.rdc.dem"
prep_gamma.py "interferograms/*/filt_*rlks.cor" --dem "geometry/sim_*rlks.rdc.dem"
prep_gamma.py "geometry/sim_*rlks.rdc.dem" --dem "geometry/sim_*rlks.rdc.dem"
grab LAT/LON_REF1/2/3/4 from par file: /media/s/新加卷/DATA_process/cangzhou/asc/P142F116/SLC/mintpy/geometry/20201112_10rlks.amp.par
prep_gamma.py "geometry/sim_*rlks.UTM_TO_RDC" --dem "geometry/sim_*rlks.rdc.dem"
writing >>> sim_20201112_10rlks.UTM_TO_RDC.rsc
prep_gamma.py "geometry/sim_*rlks.UTM_TO_RDC" --dem "geometry/sim_*rlks.rdc.dem"
writing >>> sim_20201112_10rlks.UTM_TO_RDC.rsc
--------------------------------------------------
updateMode : True
compression: None
multilook x/ystep: 1/1
multilook method : nearest
--------------------------------------------------
searching geometry files info
input data files:
height          : geometry/sim_20201112_10rlks.rdc.dem
rangeCoord      : geometry/sim_20201112_10rlks.UTM_TO_RDC
azimuthCoord    : geometry/sim_20201112_10rlks.UTM_TO_RDC
--------------------------------------------------
create HDF5 file /media/s/新加卷/DATA_process/cangzhou/asc/P142F116/SLC/mintpy/inputs/geometryGeo.h5 with w mode
create dataset /rangeCoord         of <class 'numpy.float32'>   in size of (2397, 3919) with compression = lzf
create dataset /azimuthCoord       of <class 'numpy.float32'>   in size of (2397, 3919) with compression = lzf
geocoded input, use constant value from metadata INCIDENCE_ANGLE
prepare slantRangeDistance ...
Finished writing to /media/s/新加卷/DATA_process/cangzhou/asc/P142F116/SLC/mintpy/inputs/geometryGeo.h5
--------------------------------------------------
create HDF5 file /media/s/新加卷/DATA_process/cangzhou/asc/P142F116/SLC/mintpy/inputs/geometryRadar.h5 with w mode
create dataset /height             of <class 'numpy.float32'>   in size of (1060, 1200) with compression = lzf
Finished writing to /media/s/新加卷/DATA_process/cangzhou/asc/P142F116/SLC/mintpy/inputs/geometryRadar.h5
--------------------------------------------------
searching interferogram pairs info
input data files:
unwrapPhase     : interferograms/*/diff_*rlks.unw
coherence       : interferograms/*/filt_*rlks.cor
number of unwrapPhase     : 945
number of coherence       : 945
All date12   exists in file ifgramStack.h5 with same size as required, no need to re-load.
--------------------------------------------------
searching ionosphere pairs info
input data files:
WARNING: No data files found for the required dataset: ['unwrapPhase']! Skip loading for ionosphere stack.
--------------------------------------------------
searching offset pairs info
input data files:
WARNING: No data files found for the required dataset: ['rangeOffset', 'azimuthOffset']! Skip loading for offset stack.
time used: 00 mins 4.4 secs.

Traceback (most recent call last):
  File "/home/s/anaconda3/envs/sentinel/bin/smallbaselineApp.py", line 10, in <module>
    sys.exit(main())
             ^^^^^^
  File "/home/s/anaconda3/envs/sentinel/lib/python3.12/site-packages/mintpy/cli/smallbaselineApp.py", line 209, in main
    run_smallbaselineApp(inps)
  File "/home/s/anaconda3/envs/sentinel/lib/python3.12/site-packages/mintpy/smallbaselineApp.py", line 1155, in run_smallbaselineApp
    app.run(steps=inps.runSteps)
  File "/home/s/anaconda3/envs/sentinel/lib/python3.12/site-packages/mintpy/smallbaselineApp.py", line 908, in run
    self.run_load_data(sname)
  File "/home/s/anaconda3/envs/sentinel/lib/python3.12/site-packages/mintpy/smallbaselineApp.py", line 182, in run_load_data
    stack_file, geom_file, _, ion_file = ut.check_loaded_dataset(self.workDir, print_msg=True)[:4]
                                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/s/anaconda3/envs/sentinel/lib/python3.12/site-packages/mintpy/utils/utils.py", line 89, in check_loaded_dataset
    raise ValueError(f'required dataset "{dname}" is missing in file {geom_file}')
ValueError: required dataset "height" is missing in file /media/s/新加卷/DATA_process/cangzhou/asc/P142F116/SLC/mintpy/inputs/geometryGeo.h5
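
The check that raises this error lives in mintpy/utils/utils.py (check_loaded_dataset). Its essence is simply verifying that each required dataset exists in the geometry HDF5 file; a minimal sketch of that logic (simplified, not the actual MintPy implementation) is:

```python
import h5py

def check_geometry_file(geom_file, required=('height',)):
    # Mirror the style of error MintPy raises when a required
    # dataset is absent from the geometry HDF5 file.
    with h5py.File(geom_file, 'r') as f:
        for dname in required:
            if dname not in f:
                raise ValueError(
                    f'required dataset "{dname}" is missing in file {geom_file}')
```

Here the check fails because geometryGeo.h5 was written with only the lookup-table datasets (rangeCoord/azimuthCoord), while "height" went into geometryRadar.h5.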

The template used is as follows:

mintpy.load.processor       = gamma  #[isce, aria, hyp3, gmtsar, snap, gamma, roipac, nisar], auto for isce
mintpy.load.autoPath        = auto  #[yes / no], auto for no, use pre-defined auto path
mintpy.load.updateMode      = auto  #[yes / no], auto for yes, skip re-loading if HDF5 files are complete
mintpy.load.compression     = auto  #[gzip / lzf / no], auto for no.
##---------interferogram stack:
mintpy.load.unwFile         = interferograms/*/diff_*rlks.unw  #[path pattern of unwrapped interferogram files]
mintpy.load.corFile         = interferograms/*/filt_*rlks.cor  #[path pattern of spatial coherence       files]
mintpy.load.connCompFile    = auto  #[path pattern of connected components    files], optional but recommended
mintpy.load.intFile         = auto  #[path pattern of wrapped interferogram   files], optional
mintpy.load.magFile         = auto  #[path pattern of interferogram magnitude files], optional
##---------geometry:
mintpy.load.demFile         = geometry/sim_*rlks.rdc.dem  #[path of DEM file]
mintpy.load.lookupYFile     = geometry/sim_*rlks.UTM_TO_RDC  #[path of latitude /row   /y coordinate file], not required for geocoded data
mintpy.load.lookupXFile     = geometry/sim_*rlks.UTM_TO_RDC  #[path of longitude/column/x coordinate file], not required for geocoded data
mintpy.load.incAngleFile    = auto  #[path of incidence angle file], optional but recommended
mintpy.load.azAngleFile     = auto  #[path of azimuth   angle file], optional
mintpy.load.shadowMaskFile  = auto  #[path of shadow mask file], optional but recommended
mintpy.load.waterMaskFile   = auto  #[path of water  mask file], optional but recommended
mintpy.load.bperpFile       = auto  #[path pattern of 2D perpendicular baseline file], optional

System information

Operating system: Linux
Python environment: conda
MintPy version: 1.6.1
InSAR processor/product: gamma 

I would greatly appreciate it if anyone could provide some advice.

codeautopilot[bot] commented 2 weeks ago

Potential solution

The bug is caused by the absence of the "height" dataset in the geometryGeo.h5 file, which is crucial for processing GAMMA data. The solution involves ensuring that the DEM data, which contains elevation information, is correctly processed and written into the geometryGeo.h5 file as the "height" dataset. This requires identifying the part of the code responsible for handling DEM data and ensuring it extracts and writes the "height" information into the HDF5 file.

What is causing this bug?

As noted above, the "height" dataset is derived from DEM elevation data. The error indicates that the code responsible for processing the DEM and writing it into the HDF5 file did not populate geometryGeo.h5 with this dataset.

Code

To address this issue, we need to ensure that the DEM data is processed correctly and the "height" dataset is written to the geometryGeo.h5 file. Here is a potential code snippet to achieve this:

import h5py
import numpy as np

def write_height_to_hdf5(dem_file, hdf5_file, width, length):
    # GAMMA radar-coded DEMs (e.g. sim_*rlks.rdc.dem) are raw binary
    # rasters, typically big-endian float32; width/length must be taken
    # from the matching .par/.rsc metadata file.
    dem_data = np.fromfile(dem_file, dtype='>f4').reshape(length, width).astype(np.float32)

    # Open the HDF5 file in append mode and add the dataset if absent
    with h5py.File(hdf5_file, 'a') as f:
        if 'height' not in f:
            f.create_dataset('height', data=dem_data, dtype='float32', compression='lzf')
        else:
            print('Dataset "height" already exists in the HDF5 file.')

# Example usage (dimensions taken from the geometryRadar.h5 log above)
dem_file_path = 'geometry/sim_20201112_10rlks.rdc.dem'
hdf5_file_path = '/media/s/新加卷/DATA_process/cangzhou/asc/P142F116/SLC/mintpy/inputs/geometryGeo.h5'
write_height_to_hdf5(dem_file_path, hdf5_file_path, width=1200, length=1060)

This snippet reads the GAMMA binary DEM into a NumPy array (big-endian float32 is the usual GAMMA convention; adjust the dtype if your product differs), opens the HDF5 file in append mode, and creates the "height" dataset only if it does not already exist. Note that this only papers over the symptom: a radar-coordinate DEM written into a geo-coordinate geometry file will not match the other datasets' grid.

How to replicate the bug

  1. Set up the environment with the necessary dependencies, including MintPy and its required libraries.
  2. Prepare the GAMMA data, including the interferograms and DEM files, in the specified directory structure.
  3. Run the smallbaselineApp.py script with the provided configuration file (smallbaselineApp.cfg).
  4. Observe the error message indicating that the "height" dataset is missing in the geometryGeo.h5 file.

By following these steps, you should encounter the same error, confirming the bug's presence.
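
To confirm the symptom directly, one can list the datasets actually present in the generated file. A small helper, assuming h5py is installed (this is not part of MintPy itself):

```python
import h5py

def list_datasets(h5_path):
    # Collect the full names of all datasets in an HDF5 file.
    names = []
    with h5py.File(h5_path, 'r') as f:
        f.visititems(
            lambda name, obj: names.append(name)
            if isinstance(obj, h5py.Dataset) else None)
    return sorted(names)
```

Running list_datasets('inputs/geometryGeo.h5') on the file from the log above would be expected to show the coordinate datasets but no "height" entry.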


Files used for this task:

Changes on src/mintpy/cli/prep_gamma.py

The file prep_gamma.py is a command-line interface script designed to prepare attribute files for Gamma products. It primarily handles metadata files associated with interferograms and DEM files. The script does not directly handle the creation or modification of HDF5 files such as geometryGeo.h5, which is where the missing "height" dataset error occurs.

Analysis:

1. Functionality: The script prepares metadata files for Gamma products, checking and generating the necessary metadata for interferograms and DEM files. It does not directly manipulate or create datasets within HDF5 files.
2. DEM handling: The script accepts a DEM file as an input parameter, which is relevant because the "height" dataset is typically derived from DEM data. However, the script contains no logic to extract the "height" dataset or write it to an HDF5 file.
3. Error context: The reported error indicates that the "height" dataset is missing from geometryGeo.h5. This suggests the issue lies in the part of the codebase that processes DEM data and writes it to the HDF5 file, rather than in this script.

Conclusion: prep_gamma.py does not appear to be directly responsible for the missing "height" dataset in geometryGeo.h5. The issue likely resides in another part of the codebase that converts DEM data into HDF5 datasets.

Recommendation: Investigate the code that processes DEM files and writes to geometryGeo.h5, and ensure the "height" data from the DEM is correctly extracted and included. This likely involves reviewing the DEM-conversion and HDF5-writing logic.
yunjunz commented 2 weeks ago

It seems that the DEM height has been successfully loaded into the geometryRadar.h5 file, but the ut.check_loaded_dataset() is looking for geometryGeo.h5 instead, based on the coordinates information from ifgramStack.h5 file. Are your interferograms geocoded already? If so, you should specify a DEM file in the geo-coordinate to mintpy.load.demFile, to be consistent with your interferograms.
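
For the geocoded case yunjunz describes, the fix would be a one-line template change along these lines (the geo-coordinate DEM name sim_*rlks.utm.dem is a hypothetical example; use whatever file your GAMMA geocoding step actually produced):

```cfg
mintpy.load.demFile = geometry/sim_*rlks.utm.dem  #[path of DEM file in geo-coordinates]
```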

songzwgithub commented 1 week ago

> It seems that the DEM height has been successfully loaded into the geometryRadar.h5 file, but the ut.check_loaded_dataset() is looking for geometryGeo.h5 instead, based on the coordinates information from ifgramStack.h5 file. Are your interferograms geocoded already? If so, you should specify a DEM file in the geo-coordinate to mintpy.load.demFile, to be consistent with your interferograms.

Thank you for your reply. The interferograms have not been geocoded. It seems there was a problem with the previously prepared files: after I deleted them all and reprocessed, the error no longer occurred.