oguzhannysr opened this issue 1 month ago
First, comparing different intervals is not a good idea due to the potential influence of seasonal factors. Second, it is crucial to remove orbital ramps, atmospheric phase delays, and tidal effects to avoid overestimating displacements. Gaussian filtering can indeed be used to eliminate these effects, but the filter wavelength (and corresponding sigma) must be carefully adjusted to preserve the actual deformation.
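As a rough sketch of the wavelength-to-sigma relation mentioned above, here is a generic SciPy version (not the PyGMTSAR implementation). The exp(-1/2) cut-off convention and all numeric values are illustrative assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Illustrative conversion of a cut-off wavelength (meters) to a Gaussian
# sigma in pixels. One common convention places the exp(-1/2) attenuation
# of the filter's frequency response at f = 1/wavelength, which gives
# sigma = wavelength / (2 * pi) in the same units as the wavelength.
wavelength_m = 400.0   # example cut-off wavelength in meters
pixel_size_m = 30.0    # example ground resolution in meters

sigma_px = wavelength_m / (2 * np.pi) / pixel_size_m

# apply the low-pass filter to a phase grid (random data as a stand-in)
phase = np.random.default_rng(0).normal(size=(256, 256))
low_pass = gaussian_filter(phase, sigma=sigma_px)
high_pass = phase - low_pass  # the short-wavelength residual
```

With these numbers, sigma comes out to roughly 2 pixels, which is why a cut-off wavelength close to the pixel size barely smooths anything.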
@AlexeyPechnikov, what exactly is the filter wavelength? Is it the one you mentioned in the code below? I set it to 30 because of the spatial resolution.
sbas.compute_interferogram_multilook(baseline_pairs, 'intf_mlook', wavelength=30, weight=sbas.psfunction())
@AlexeyPechnikov, also, I get an error like this in the last section, even though the area is actually quite small. I tried restarting the client, but it didn't help. Could I overcome this problem with Colab Pro? Is it related to this?
@AlexeyPechnikov, for example, I tried narrowing the date range a little and it works now, but even for a small area, exporting each date takes almost 30 minutes. How can I speed this up?
What exactly is the filter wavelength? Is it the one you mentioned in the code below? I set it to 30 because of the spatial resolution.
sbas.compute_interferogram_multilook(baseline_pairs, 'intf_mlook', wavelength=30, weight=sbas.psfunction())
That code performs interferogram creation with Gaussian filtering, but I am talking about Gaussian detrending after unwrapping.
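A minimal sketch of the detrending idea, using plain SciPy rather than the PyGMTSAR API: estimate the long-wavelength component (ramps, atmospheric delay) of the unwrapped phase with a wide Gaussian low-pass and subtract it. The grid and sigma here are stand-ins, not recommended values:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# stand-in for an unwrapped phase grid with a smooth ramp plus noise
rng = np.random.default_rng(1)
unwrap = np.linspace(0, 10, 200)[None, :] + rng.normal(size=(200, 200))

# long-wavelength trend: a wide Gaussian low-pass (sigma is an assumption)
trend = gaussian_filter(unwrap, sigma=50)

# detrended phase keeps only the short-wavelength deformation signal
detrended = unwrap - trend
```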
Also, I get an error like this in the last section, even though the area is actually quite small.
Check whether you have materialized grids or lazy ones; in the latter case, a long lazy pipeline can fail even on high-RAM hosts.
@AlexeyPechnikov, Alexey, regarding the wavelength selection, I think you are talking about the line in the Imperial Valley example below. Would it be sensible to set the wavelength to 100 given my 30 m spatial resolution?
# Gaussian filtering 400m cut-off wavelength with multilooking 1x4 on Sentinel-1 intensity
intensity = sbas.multilooking(np.square(np.abs(data)), wavelength=400, coarsen=(1,4))
How can I check whether the grids are materialized or lazy? I don't know Dask.
Absolutely not; that is not a 'Stack.gaussian' function call.
If you are not sure, use the 'sync' functions for every step, as in the large-dataset examples.
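For readers new to Dask, a quick way to tell a lazy grid from a materialized one: a Dask-backed xarray DataArray reports chunks, while a NumPy-backed one reports None. This is generic xarray/Dask behavior, not PyGMTSAR-specific:

```python
import numpy as np
import xarray as xr
import dask.array as da

# a lazy (Dask-backed) array vs. a materialized (NumPy-backed) one
lazy = xr.DataArray(da.zeros((100, 100), chunks=50))
materialized = xr.DataArray(np.zeros((100, 100)))

print(lazy.chunks)          # chunk tuples -> lazy, nothing computed yet
print(materialized.chunks)  # None -> already in memory

# force computation, analogous to what the 'sync' helpers do before export
now_real = lazy.compute()
```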
First, comparing different intervals is not a good idea due to the potential influence of seasonal factors.
@AlexeyPechnikov, I don't understand what you mean here. I want to measure the cumulative deformation since 2016; would that be seasonally inaccurate?
As mentioned above:
while the deformation in 4 months with PYGMTSAR was 238 mm
This time frame is too short for cases with significant seasonal changes. However, the magnitude of the change is large enough that it is likely due to unremoved tidal effects or atmospheric phase delays.
@AlexeyPechnikov, thank you, Alexey. I still haven't solved my problem with Dask; how can I speed up the export process? Google Colab stops after a certain period of time. Would running pygmtsar on a Colab GPU help in terms of RAM?
Since these lines take a long time, I start my notebook with them commented out. Is this why the export process takes so long?
# optionally, materialize to disk and open
#stl_sbas = sbas.sync_cube(stl_sbas, 'stl_sbas')
how can I speed up the export process?
You can export materialized datasets that are synced to disk.
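The pattern can be sketched with plain xarray (file and variable names here are hypothetical, and a temporary directory stands in for your working folder): write the cube to disk once, then export each date from the on-disk copy instead of re-running the whole lazy pipeline per date.

```python
import os
import tempfile
import numpy as np
import xarray as xr

# stand-in for a displacement cube (date, y, x)
cube = xr.DataArray(
    np.random.default_rng(2).normal(size=(3, 16, 16)),
    dims=('date', 'y', 'x'),
    name='disp',
)

# materialize once to disk (the role sync_cube plays in the notebooks)
path = os.path.join(tempfile.mkdtemp(), 'disp_cube.nc')
cube.to_netcdf(path)
on_disk = xr.open_dataarray(path)

# per-date exports now read precomputed values instead of recomputing
frames = [on_disk.isel(date=i) for i in range(on_disk.sizes['date'])]
```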
@AlexeyPechnikov, Alexey, I got results in the same area for both orbits, but I couldn't decide which one to use. What is the correct way to determine this? I don't know much about the terrain. I am trying to analyze the runway and taxiways in the middle, investigating which parts have subsided or risen more.
[images: Descending and Ascending SBAS results; airport, airport-desc, and airport-asc views]
The results look noisy. Have you performed atmospheric corrections and other processing steps? You can combine both orbits to derive vertical and horizontal displacements.
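The combination can be sketched with simplified geometry (this is not the PyGMTSAR API; incidence angles, signs, and displacement values below are illustrative assumptions). Ascending and descending LOS measurements give two equations per pixel, from which vertical and east-west components can be solved; the north component is neglected, a common approximation since near-polar orbits are almost insensitive to north-south motion:

```python
import numpy as np

# example incidence angles (assumed values)
theta_asc = np.radians(39.0)
theta_desc = np.radians(34.0)

# d_los = cos(theta) * U + s * sin(theta) * E, where the east sign s
# flips between ascending and descending because the look directions
# are roughly opposite in the east-west direction
A = np.array([
    [np.cos(theta_asc), -np.sin(theta_asc)],
    [np.cos(theta_desc), np.sin(theta_desc)],
])

d = np.array([-12.0, -5.0])   # example LOS displacements, mm

# vertical (U) and east-west (E) components from the 2x2 system;
# over a full grid this can be vectorized with np.linalg.lstsq
U, E = np.linalg.solve(A, d)
```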
@AlexeyPechnikov, yes, I followed the steps in the otmanbozdagh notebook; I believe I applied it in the turbulent atmospheric effects section. I'm still undecided on which orbit's result to use. How can I separate vertical and horizontal displacements? Is this possible in pygmtsar? I couldn't find any examples.
@AlexeyPechnikov , Alexey, I'm waiting for your comments and help...
If you’ve applied turbulent atmospheric correction and still see noisy results, it usually means your interferograms are too noisy. You can filter out the noisiest ones or unwrap only the best-correlated pixels to achieve numerical stability. There are two documents on my Patreon that may help in selecting the proper processing parameters and estimating accuracy: “Baseline Networks for PS and SBAS Analyses, 2024” and “Residuals of Topographic Phase and Constant Phase Delays.”
@AlexeyPechnikov, what threshold should I apply to the coherence value to select these pixels? Is there a value you recommend, or does pygmtsar provide a way to determine it?
Use the stack correlation map to estimate the correlation threshold, as demonstrated in the provided examples.
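The masking step can be sketched with generic xarray (variable names are hypothetical, and the 0.3 threshold is an illustrative assumption; in practice, pick it from the stack correlation map's histogram):

```python
import numpy as np
import xarray as xr

# stand-in for per-pair correlation grids (pair, y, x)
corr_stack = xr.DataArray(
    np.random.default_rng(3).uniform(0, 1, size=(10, 32, 32)),
    dims=('pair', 'y', 'x'),
)

# stack correlation map: average correlation over all pairs
corr_mean = corr_stack.mean('pair')

# keep only well-correlated pixels; low-correlation pixels become NaN
threshold = 0.3
masked = corr_mean.where(corr_mean >= threshold)
```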
@AlexeyPechnikov, Alexey, I have a different question: how reliable would the PSI method be over an actively operating landfill site? Also, as far as I know PSI is available in pygmtsar, is that right?
@AlexeyPechnikov, hello Alexey. As seen in the image, these are LOS-direction deformations from the SBAS analysis. The site is a landfill, so large deformation due to activity is normal; however, the values are extremely high for such a short time period. What could be causing this? Atmospheric problems should not be the cause, because they have been eliminated. How reasonable would Gaussian filtering be here? I had obtained very high deformation values in my previous experiments with the notebooks. When I tried the same region with LiCSBAS, the maximum cumulative LOS value between 2016 and 2024 was 160 mm, while the deformation in 4 months with PYGMTSAR was 238 mm.