insarlab / MintPy

Miami InSAR time-series software in Python
https://mintpy.readthedocs.io

hdfeos5: latitude/longitude missing for hyp3 product #1049

Closed: fukun364202818 closed this issue 1 year ago

fukun364202818 commented 1 year ago

Description of the problem

I ran save_hdfeos5.py timeseries_ERA5_demErr.h5 --tc temporalCoherence.h5 --asc avgSpatialCoh.h5 -g inputs/geometryGeo.h5 -t mintpy_config.txt to generate S1_IW12_128_0593_0597_20141213_20160524.he5. The output is as follows:

read options from template file: mintpy_config.txt
## UNAVCO Metadata:
-----------------------------------------
  atmos_correct_method      None
  beam_mode                 IW
  beam_swath                0
  data_footprint            POLYGON((205760.0 3468480.0,496720.0 3468480.0,496720.0 3241840.0,205760.0 3241840.0,205760.0 3468480.0))
  first_date                2022-06-10
  first_frame               0
  flight_direction          A
  history                   2023-07-26
  last_date                 2023-06-05
  last_frame                0
  look_direction            R
  mission                   S1
  polarization              Unknown
  post_processing_method    MintPy
  prf                       0.0
  processing_dem            Unknown
  processing_software       hyp3
  processing_type           LOS_TIMESERIES
  relative_orbit            127
  scene_footprint           POLYGON((205760.0 3241840.0,205760.0 3468480.0,496720.0 3468480.0,496720.0 3241840.0,205760.0 3241840.0))
  unwrap_method             Unknown
  wavelength                0.055465764662349676

-----------------------------------------
create HDF5 file: S1_IW_127_0000_20220610_20230605.he5 with w mode
create group   /HDFEOS/GRIDS/timeseries/observation
create dataset /HDFEOS/GRIDS/timeseries/observation/displacement        of float32    in size of (30, 2833, 3637) with compression=lzf
write data acquition by acquition ...
[==================================================] 30/30 20230605   21s /     0s
create dataset /HDFEOS/GRIDS/timeseries/observation/date               of |S8        in size of (30,) with compression=lzf
create dataset /HDFEOS/GRIDS/timeseries/observation/bperp              of float32    in size of (30,) with compression=lzf
create group   /HDFEOS/GRIDS/timeseries/quality
create dataset /HDFEOS/GRIDS/timeseries/quality/temporalCoherence      of float32    in size of (2833, 3637) with compression=lzf
create dataset /HDFEOS/GRIDS/timeseries/quality/avgSpatialCoherence    of float32    in size of (2833, 3637) with compression=lzf
create dataset /HDFEOS/GRIDS/timeseries/quality/mask                   of bool       in size of (2833, 3637) with compression=lzf
create group   /HDFEOS/GRIDS/timeseries/geometry
create dataset /HDFEOS/GRIDS/timeseries/geometry/azimuthAngle          of float32    in size of (2833, 3637) with compression=lzf
create dataset /HDFEOS/GRIDS/timeseries/geometry/height                of float32    in size of (2833, 3637) with compression=lzf
create dataset /HDFEOS/GRIDS/timeseries/geometry/incidenceAngle        of float32    in size of (2833, 3637) with compression=lzf
create dataset /HDFEOS/GRIDS/timeseries/geometry/slantRangeDistance    of float32    in size of (2833, 3637) with compression=lzf
create dataset /HDFEOS/GRIDS/timeseries/geometry/waterMask             of bool       in size of (2833, 3637) with compression=lzf
write metadata to root level
finished writing to S1_IW_127_0000_20220610_20230605.he5
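Note that the geometry group created above contains azimuthAngle, height, incidenceAngle, slantRangeDistance, and waterMask, but no latitude or longitude datasets. A minimal h5py sketch of how a downstream script could probe the group safely before indexing it (the tiny demo file built here is hypothetical and only mimics the structure shown in the log, at a reduced size):

```python
import os
import tempfile

import h5py
import numpy as np

# Build a small file mimicking the geometry group of the HDF-EOS5 product
# above: same dataset names, but shrunk from (2833, 3637) to (4, 5), and
# deliberately without latitude/longitude datasets.
path = os.path.join(tempfile.mkdtemp(), "demo.he5")
with h5py.File(path, "w") as f:
    geom = f.create_group("HDFEOS/GRIDS/timeseries/geometry")
    for name in ("azimuthAngle", "height", "incidenceAngle", "slantRangeDistance"):
        geom.create_dataset(name, data=np.zeros((4, 5), dtype=np.float32))

# Probe the group with a membership test instead of indexing directly,
# which avoids the KeyError seen later in this thread.
with h5py.File(path, "r") as f:
    geom = f["HDFEOS/GRIDS/timeseries/geometry"]
    present = sorted(geom.keys())
    has_latlon = "latitude" in geom and "longitude" in geom
    print(present, has_latlon)
```

Checking membership first makes the failure mode explicit: a script can then fall back to deriving the coordinates from metadata instead of crashing.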

Then I ran python hdfeos5_2json_mbtiles.py --num-workers 8 file ../2023_chengdu/S1_IW_127_0000_20220610_20230605.he5 outputDir ../2023_chengdu/OutPut to convert the data.

Full script that generated the error

hdfeos5_2json_mbtiles.py

Full error message

reading displacement data from file: ../2023_chengdu/S1_IW_127_0000_20220610_20230605.he5 ...
reading mask data from file: ../2023_chengdu/S1_IW_127_0000_20220610_20230605.he5 ...
Masking displacement
Creating shared memory for multiple processes
../2023_chengdu/test_output already exists
Traceback (most recent call last):
  File "/home/PycharmProjects/hyp3_mintpy/insarmaps_scripts/hdfeos5_2json_mbtiles.py", line 413, in <module>
    main()
  File "/home/PycharmProjects/hyp3_mintpy/insarmaps_scripts/hdfeos5_2json_mbtiles.py", line 387, in main
    lats = np.array(f["HDFEOS"]["GRIDS"]["timeseries"]["geometry"]["latitude"])
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "/home/anaconda3/envs/hyp3-mintpy/lib/python3.9/site-packages/h5py/_hl/group.py", line 357, in __getitem__
    oid = h5o.open(self.id, self._e(name), lapl=self._lapl)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py/h5o.pyx", line 190, in h5py.h5o.open
KeyError: "Unable to synchronously open object (object 'latitude' doesn't exist)"
(hyp3-mintpy) root@ecs-65ca-0420274:/home/PycharmProjects/hyp3_mintpy/insarmaps_scripts# /home/anaconda3/envs/hyp3-mintpy/lib/python3.9/multiprocessing/resource_tracker.py:216: UserWarning: resource_tracker: There appear to be 1 leaked shared_memory objects to clean up at shutdown
  warnings.warn('resource_tracker: There appear to be %d '
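The KeyError occurs because this geocoded product stores no 2D latitude/longitude datasets; for geocoded files the coordinates are implicit in the grid metadata and can be reconstructed from it. A minimal numpy sketch of that reconstruction, where the key names (Y_FIRST, Y_STEP, X_FIRST, X_STEP, LENGTH, WIDTH) follow MintPy's geocoded-metadata convention but the numeric values below are made up for illustration:

```python
import numpy as np

# Hypothetical metadata values; real values come from the HDF-EOS5
# root-level attributes written by save_hdfeos5.py.
meta = {
    "Y_FIRST": 31.35, "Y_STEP": -0.0008,   # degrees; north-up, so Y_STEP < 0
    "X_FIRST": 102.0, "X_STEP": 0.0008,
    "LENGTH": 2833, "WIDTH": 3637,
}

def lat_lon_from_meta(meta):
    """Build 2D longitude/latitude grids from geocoded-grid metadata."""
    length, width = int(meta["LENGTH"]), int(meta["WIDTH"])
    lats = float(meta["Y_FIRST"]) + float(meta["Y_STEP"]) * np.arange(length)
    lons = float(meta["X_FIRST"]) + float(meta["X_STEP"]) * np.arange(width)
    # meshgrid with default 'xy' indexing returns arrays of shape (length, width)
    return np.meshgrid(lons, lats)

lon2d, lat2d = lat_lon_from_meta(meta)
print(lat2d.shape, lon2d.shape)
```

A consumer script that supports both radar-coded and geocoded inputs would read the latitude/longitude datasets when present and fall back to a computation like this otherwise.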

System information

yunjunz commented 1 year ago

Please fill in all the items in the issue template.

fukun364202818 commented 1 year ago

> Please fill in all the items in the issue template.

I have updated my question, thanks.

yunjunz commented 1 year ago

Thank you @fukun364202818 for your persistence with the issue reporting. With your updated description, I was able to locate the cause in the hyp3+MintPy workflow and reproduce your HDF-EOS5 file structure. The two PRs above should have fixed the issue. Cheers!

fukun364202818 commented 1 year ago

@yunjunz Thank you very much, the bug has been fixed. Cheers!