insarlab / MintPy

Miami InSAR time-series software in Python
https://mintpy.readthedocs.io

GMTSAR: metadata preparation #452

Closed bjmarfito closed 3 years ago

bjmarfito commented 3 years ago

Requesting support for GMTSAR Sentinel-1 stack outputs.

yunjunz commented 3 years ago

GMTSAR support is definitely desired. Unfortunately, none of our current developers have GMTSAR experience, nor do we have an example GMTSAR stack dataset to test on. @bjmarfito if you or others would like to contribute this support, I am more than happy to help and to answer any questions.

bjmarfito commented 3 years ago

Thank you for the reply. Yes, I could start on the task next month. I'll update you regarding my progress.

yunjunz commented 3 years ago

Great! Feel free to contact me here or through email @bjmarfito.

yunjunz commented 3 years ago

Here are the corresponding attributes used by mintpy/roipac and gmtsar:

| mintpy/roipac | gmtsar |
| --- | --- |
| ALOOKS | grdinfo (from below) |
| ANTENNA_SIDE | PRM/lookdir |
| AZIMUTH_PIXEL_SIZE | SC_vel/PRF * Re/(Re+SC_height) * ALOOKS (see the sketch after the table) |
| CENTER_LINE_UTC | PRM/clock_start/stop |
| EARTH_RADIUS | PRM/earth_radius |
| HEADING | optional, manually specified in the tmpl file, from proj_ra2ll_ascii.csh or Google Earth |
| HEIGHT | PRM/SC_height |
| LAT_REF1 | grdinfo (from below) |
| LAT_REF2 | grdinfo (from below) |
| LAT_REF3 | grdinfo (from below) |
| LAT_REF4 | grdinfo (from below) |
| LON_REF1 | grdinfo (from below) |
| LON_REF2 | grdinfo (from below) |
| LON_REF3 | grdinfo (from below) |
| LON_REF4 | grdinfo (from below) |
| NCORRLOOKS | optional and ignored for now |
| ORBIT_DIRECTION | grab from PRM, or return an error asking for manual specification |
| PRF | PRM/PRF |
| PROCESSOR | hardwired as gmtsar |
| RANGE_PIXEL_SIZE | SPEED_OF_LIGHT/rng_samp_rate/2 * RLOOKS |
| RLOOKS | grdinfo (from below) |
| STARTING_RANGE | PRM/near_range |
| WAVELENGTH | PRM/radar_wavelength |
| DATE12 | YYYYMMDD.prm files |
| P_BASELINE_TOP_HDR | SAT_baseline PRM1 PRM2 baseline_start, or baseline_table.dat |
| P_BASELINE_BOTTOM_HDR | SAT_baseline PRM1 PRM2 baseline_end, or baseline_table.dat |
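
For the two derived attributes above, a minimal Python sketch of the formulas (assuming the PRM values SC_vel, PRF, earth_radius, SC_height and rng_samp_rate have already been read as floats; the function names here are illustrative, not existing MintPy code):

```python
# Sketch of the two derived attributes from the table above (illustrative only).
SPEED_OF_LIGHT = 299792458.0  # m/s

def azimuth_pixel_size(sc_vel, prf, earth_radius, sc_height, alooks=1):
    """AZIMUTH_PIXEL_SIZE = SC_vel / PRF * Re / (Re + SC_height) * ALOOKS"""
    return sc_vel / prf * earth_radius / (earth_radius + sc_height) * alooks

def range_pixel_size(rng_samp_rate, rlooks=1):
    """RANGE_PIXEL_SIZE = SPEED_OF_LIGHT / rng_samp_rate / 2 * RLOOKS"""
    return SPEED_OF_LIGHT / rng_samp_rate / 2.0 * rlooks
```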

Questions:

  1. What does "SC" stand for, e.g. in SC_height?

Xiaohua-Eric-Xu commented 3 years ago

Hi, here are some parameters one could easily get from the files GMTSAR produces. I could write a shell script to do these, but it might be better to keep things in Python, to be consistent with the package.

In any interferogram folder, e.g. intf_all/day1_day2, *_ll.grd are geocoded netcdf grids (e.g. phasefilt_ll.grd), while those with no "_ll" are grids in radar coordinates (e.g. phasefilt.grd).

Some of the grd queries could be done with pygmt.gmt_command(). I don't know how to use it, but it shouldn't be too different from the example commands below. The original PRM file may not exist in the merged directory if you are processing ScanSAR/TOPS data; you probably need to use the one in the subswath directory.

- ALOOKS: For a given .grd file in radar coordinates, e.g. phasefilt.grd, x_inc. `gmt grdinfo phasefilt.grd -C | awk '{print $8}'`
- RLOOKS: For a given .grd file in radar coordinates, y_inc. `gmt grdinfo phasefilt.grd -C | awk '{print $9}'`
- ANTENNA_SIDE: For any PRM file, lookdir: R for right, L for left. `grep lookdir S1_20141110_ALL_F1.PRM | awk '{print $3}'`
- CENTER_LINE_UTC: For any PRM file, (SC_clock_start+SC_clock_stop)/2. This is in the format yyyyddd.ffffff..., where ddd is the day number, counting from 0 for Sentinel-1 (for older satellites, some count from 1).
- EARTH_RADIUS: For any PRM file, earth_radius. `grep earth_radius S1_20141110_ALL_F1.PRM | awk '{print $3}'`
- HEADING: !! This is not stored as a parameter. We could compute it with proj_ra2ll_ascii.csh, but it is not stored anywhere. Or one could open the images in Google Earth and measure the direction. Maybe specify this as an input parameter.
- HEIGHT: For any PRM file, SC_height. `grep SC_height S1_20141110_ALL_F1.PRM | awk '{print $3}'` Note that SC_height_start/end denote the heights at the beginning and end of the acquisition.
- LAT_REF1: For a given .grd file in geo-coordinates, e.g. phasefilt_ll.grd, x_min. `gmt grdinfo phasefilt_ll.grd -C | awk '{print $2}'`
- LAT_REF2: For a given .grd file in geo-coordinates, x_max. `gmt grdinfo phasefilt_ll.grd -C | awk '{print $3}'`
- LAT_REF3: x_max. `gmt grdinfo phasefilt_ll.grd -C | awk '{print $3}'`
- LAT_REF4: x_min. `gmt grdinfo phasefilt_ll.grd -C | awk '{print $2}'`
- LON_REF1: y_max. `gmt grdinfo phasefilt_ll.grd -C | awk '{print $5}'`
- LON_REF2: y_max. `gmt grdinfo phasefilt_ll.grd -C | awk '{print $5}'`
- LON_REF3: y_min. `gmt grdinfo phasefilt_ll.grd -C | awk '{print $4}'`
- LON_REF4: y_min. `gmt grdinfo phasefilt_ll.grd -C | awk '{print $4}'`
- NCORRLOOKS: For any PRM, filter_wavelength^2.
- ORBIT_DIRECTION: For any PRM, orbdir.
- PRF: For any PRM, PRF.
- PROCESSOR: GMTSAR
- AZIMUTH_PIXEL_SIZE: This needs to be computed; from the PRM: SC_vel*earth_radius/(earth_radius+SC_height)/PRF. We do not store the antenna size, and it is hard to account for the antenna sweeping or burst acquisition modes, so the real resolution is hard to tell.
- RANGE_PIXEL_SIZE: This needs to be computed; for any PRM file: 299792458/rng_samp_rate/2 for the pixel size, 299792458/(chirp_slope*pulse_duration)/2 for the resolution.
- STARTING_RANGE: For any PRM file, near_range. `grep near_range S1_20141110_ALL_F1.PRM | awk '{print $3}'` This is the range used by focusing; the real range may differ depending on the atmospheric delay.
- WAVELENGTH: For any PRM file, radar_wavelength. `grep radar_wavelength S1_20141110_ALL_F1.PRM | awk '{print $3}'`
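
The grep/awk calls above translate directly to Python. Below is a minimal sketch along those lines; the helper names (read_prm, grdinfo) are illustrative, and the file names are simply the ones used in the example commands:

```python
import subprocess

def read_prm(prm_file):
    """Parse a GMTSAR PRM file ('key = value' lines) into a dict of strings."""
    prm = {}
    with open(prm_file) as f:
        for line in f:
            if '=' in line:
                key, value = line.split('=', 1)
                prm[key.strip()] = value.strip()
    return prm

def grdinfo(grd_file):
    """Return the fields of 'gmt grdinfo <grd_file> -C' as a list of strings."""
    out = subprocess.check_output(['gmt', 'grdinfo', grd_file, '-C'], text=True)
    return out.split()

# Example usage with the files named above:
# prm = read_prm('S1_20141110_ALL_F1.PRM')
# earth_radius = float(prm['earth_radius'])
# info = grdinfo('phasefilt_ll.grd')  # info[1..4] = x_min, x_max, y_min, y_max (awk $2..$5)
```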

For the baselines: they are not stored, but can be obtained by running SAT_baseline PRM1 PRM2 in each interferogram directory. The relevant parameters in the output are, e.g.:

B_parallel = -18.309419218906
B_perpendicular = -23.036166125669

More detailed parameters are these:

baseline_start = 29.426175115075
baseline_center = 32.525720325876
baseline_end = 35.603622454575
alpha_start = 175.360226635715
alpha_center = 176.302804868524
alpha_end = 177.073455583871

Or one could use the baseline_table.dat produced in batch mode, where the parallel and perpendicular baselines (with respect to the master/reference image) are in the 4th and 5th columns.
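
A minimal sketch of the baseline_table.dat route described above; the column positions follow the 4th/5th-column description and should be verified against an actual file:

```python
def read_baseline_table(fname='baseline_table.dat'):
    """Read per-epoch parallel/perpendicular baselines (w.r.t. the reference image)
    from GMTSAR's baseline_table.dat, using the 4th/5th columns mentioned above."""
    baselines = {}
    with open(fname) as f:
        for line in f:
            fields = line.split()
            if len(fields) < 5:
                continue
            # fields[0]: image stem; fields[3]/fields[4]: B_parallel / B_perpendicular
            baselines[fields[0]] = (float(fields[3]), float(fields[4]))
    return baselines

# The baseline of a pair is then the difference between its two epochs, e.g.:
# bperp_12 = baselines[stem2][1] - baselines[stem1][1]
```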

Q: SC stands for spacecraft, as used in SC_height, etc.

Cheers, Xiaohua(Eric)

yunjunz commented 3 years ago

Thank you @Xiaohua-Eric-Xu for the detailed explanation!

From the example dataset that Becca shared with us, I could find all the parameters you described above, except for orbdir and filter_wavelength. Any idea?

yunjunz commented 3 years ago

> CENTER_LINE_UTC: For any PRM file, (SC_clock_start+SC_clock_stop)/2. This is in the format yyyyddd.ffffff..., where ddd is the day number, counting from 0 for Sentinel-1 (for older satellites, some count from 1).

Does the interferogram folder name follow the same convention? More specifically, is Sentinel-1 the only satellite whose day count starts from 0?

Is the .ffffff part the fraction of one day? We need to convert it into seconds of the day.

Xiaohua-Eric-Xu commented 3 years ago

Hi @yunjunz, for filter_wavelength, that was my fault: it should be specified in the config file for processing, not in any PRM file, yet the files Becca sent do not include it. For orbdir, my guess is that the ENVISAT metadata does not have that info, so it is not included in the PRM. Another nasty way of getting the orbdir is to check the last column in the corresponding LED file, which should be z_dot in the Earth-fixed reference frame: if it's negative, it's descending; if it's positive, it's ascending. Sorry for the confusion, Xiaohua(Eric)

Xiaohua-Eric-Xu commented 3 years ago

Hi @yunjunz, yes, it is the fraction of the day. Not only Sentinel-1: most newer satellites, e.g. TSX, RS2, CSK, Sentinel-1, do that, while older ones do not. We are working on fixing this, which involves some really old code. To convert it to seconds, simply multiply the fraction part by 86400.0. Xiaohua(Eric)
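
That conversion could look like the sketch below (assuming SC_clock_start/SC_clock_stop have been read from a PRM file and fall on the same day; this is not an existing GMTSAR or MintPy function):

```python
def center_line_utc(sc_clock_start, sc_clock_stop):
    """CENTER_LINE_UTC in seconds of the day, from the PRM SC_clock_start/stop
    values given in yyyyddd.ffffff format."""
    t_center = (float(sc_clock_start) + float(sc_clock_stop)) / 2.0
    day_fraction = t_center - int(t_center)  # keep only the .ffffff part
    return day_fraction * 86400.0            # fraction of a day -> seconds
```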

yunjunz commented 3 years ago

As for the yyyyddd, we would need to convert it to YYYYMMDD for the DATE12 of each interferogram. What is your suggestion for now? Shall we wait for the fix from gmtsar, or implement an if check in mintpy with an explicit list of the satellite names involved?

Xiaohua-Eric-Xu commented 3 years ago

Hi @yunjunz, I am not sure when we'll be able to fully fix this, as I did not write those codes and have to do some digging. So please use the list below for now:

- starting from 0: Sentinel-1, CSK, RS2, TSX
- starting from 1: ERS1/2, ALOS1/2, ENVISAT

I'll keep you posted when there's an update. Thanks, Xiaohua(Eric)
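
Based on that list, the if check in mintpy could look like the sketch below; the mission grouping is copied from the list above and the function/constant names are illustrative:

```python
import datetime as dt

# Day-of-year counting convention, per the list above (may change once gmtsar is fixed)
COUNT_FROM_ZERO = {'Sentinel-1', 'CSK', 'RS2', 'TSX'}
COUNT_FROM_ONE = {'ERS1', 'ERS2', 'ALOS1', 'ALOS2', 'ENVISAT'}

def yyyyddd_to_yyyymmdd(clock_value, mission):
    """Convert a PRM clock value in yyyyddd.ffffff format into a YYYYMMDD string."""
    year = int(clock_value) // 1000
    doy = int(clock_value) % 1000
    if mission in COUNT_FROM_ZERO:
        doy += 1  # shift to the conventional 1-based day of year
    date = dt.datetime(year, 1, 1) + dt.timedelta(days=doy - 1)
    return date.strftime('%Y%m%d')
```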

yunjunz commented 3 years ago

If we don't expect filter_wavelength to always be available, a lazy way would be to guess it from A/RLOOKS: the impact on the weighting should be negligible.
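
As a placeholder, that lazy guess could be as simple as the sketch below; this approximation is my assumption, not something stored by GMTSAR:

```python
def guess_ncorrlooks(alooks, rlooks):
    """Rough fallback for NCORRLOOKS when filter_wavelength is not available:
    just the product of the azimuth and range multilook factors."""
    return alooks * rlooks
```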

For the orbdir and z_dot, could you describe more? Below is the LED file I have:

28 2004 114 65788 60 
2004 114 65788 +1155463.154 -0187023.247 +7057794.131 -4122.103661 -6293.229760 +0506.835159
2004 114 65848 +0904409.227 -0562917.422 +7074392.795 -4243.565409 -6232.346772 +0046.274288
2004 114 65908 +0646577.437 -0934401.479 +7063343.429 -4347.922857 -6146.298169 -0414.467378
2004 114 65968 +0383013.054 -1299979.513 +7024688.895 -4434.554195 -6035.568427 -0873.599962
...
Xiaohua-Eric-Xu commented 3 years ago

Good idea. Here's how we design the decimation or looks: the decimation will be an integer value that ensures 4 samples per filter_wavelength, unless it is forced in the config file. In other words, it is possible that one filters the data at 200 m but samples at 30 m-ish. Anyway, as you said, it should not matter too much. As for the LED file: you may have seen that the records are every minute. Though the data only spans about 10 s, the orbit has 28 records, which is half an hour, so at the start it's around the polar region. Taking the record in the middle might be the way to go. However, this may not be accurate if the images are at polar regions.
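
A minimal sketch of that middle-record check, assuming the first LED line is a header and the last column of each state-vector record is z_dot, per the discussion above:

```python
def orbit_direction_from_led(led_file):
    """Guess ORBIT_DIRECTION from a GMTSAR LED file: take the middle state-vector
    record and check the sign of its last column (z_dot, Earth-fixed frame).
    May be unreliable if the scene itself is near the poles."""
    with open(led_file) as f:
        lines = f.read().splitlines()
    records = lines[1:]  # first line is the header (number of records, epoch, interval)
    z_dot = float(records[len(records) // 2].split()[-1])
    return 'ASCENDING' if z_dot > 0 else 'DESCENDING'
```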

yunjunz commented 3 years ago

Thank you @Xiaohua-Eric-Xu for the explanation. Grabbing it from the LED file seems overcomplicated; I suggest we shoot for this: if the script can find orbdir in the PRM file, use it; otherwise, return an error and ask the user to specify it manually in the template file.

Let's start with ingesting the geocoded gmtsar products into mintpy. If all works fine, we can further discuss the geocoding format / lookup table from gmtsar, preferably in another independent github issue.

Xiaohua-Eric-Xu commented 3 years ago

Sounds good!

bjmarfito commented 3 years ago

Thank you for making this happen. I haven't been able to contribute for the past few months, and can't right now, since I'm finishing my master's thesis.

yunjunz commented 3 years ago

Hi @Xiaohua-Eric-Xu, regarding the output of SAT_baseline, what are the following parameters?

- B_perpendicular for the perpendicular baseline?
- baseline_start/center/end for the total baseline?
- What is alpha_start/center/end?

Xiaohua-Eric-Xu commented 3 years ago

Hi @yunjunz, yes, B_perpendicular is the perpendicular baseline. baseline_start/center/end are the lengths of the baseline at the start, center and end of the image. alpha_start/center/end is the angle of the baseline. These are used when computing drho. There is another set of parameters called B_offset, used to account for situations where the two orbits are not parallel.
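
A minimal sketch of parsing that SAT_baseline output; the keys are the ones quoted earlier in this thread, the mapping to P_BASELINE_TOP/BOTTOM_HDR follows the attribute table near the top of this issue, and the variable names (prm_reference, prm_secondary, atr) are illustrative:

```python
import subprocess

def sat_baseline(prm1, prm2):
    """Run GMTSAR's SAT_baseline on two PRM files and return its 'key = value'
    output (B_parallel, B_perpendicular, baseline_start/center/end, alpha_*, ...) as a dict."""
    out = subprocess.check_output(['SAT_baseline', prm1, prm2], text=True)
    values = {}
    for line in out.splitlines():
        if '=' in line:
            key, value = line.split('=', 1)
            values[key.strip()] = value.strip()
    return values

# Example usage, following the attribute table above:
# b = sat_baseline(prm_reference, prm_secondary)
# atr['P_BASELINE_TOP_HDR'] = b['baseline_start']
# atr['P_BASELINE_BOTTOM_HDR'] = b['baseline_end']
```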

soroushyasini commented 7 months ago

Hi, I am still searching for a straightforward approach to get from GMTSAR interferograms to a MintPy time series. Any suggestions?

Xiaohua-Eric-Xu commented 6 months ago

Have you tried what's suggested here? https://github.com/insarlab/MintPy/pull/547

soroushyasini commented 4 months ago

Sorry for the late response @Xiaohua-Eric-Xu. Actually, yes. Since MintPy needs a baseline.txt file calculated from the *.PRM files located in the interferogram folder, I wrote a script that makes baseline.txt with SAT_baseline.csh for each interferogram in the intf_all directory (since GMTSAR does not make this file for each interferogram by default, as far as I know). I am considering contributing to this project :) Any suggestions?

Xiaohua-Eric-Xu commented 3 months ago

That's cool, if you are willing to help make a conversion script to calculate the baseline info. Have you tried directly calculating the baseline info from baseline_table.dat (you could easily create one with get_baseline_table.csh)?