Closed jkingslake closed 9 months ago
I am also wondering about the processing flow for polarimetry data. I looked and couldn't find much, but please point me toward it if it is already here or you are working on adding it.
Ultimately, I think it would be useful to have two separate options for processing polarimetry data. The first, very similar to the interferometry processing, would use the same coherence function (like line 562 in ApRESDefs.py; Young et al., 2021; Ershadi et al., 2022). The second would be a co-registration method like Zeising et al. (2023).
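For reference, a minimal sketch of the kind of depth-windowed complex coherence used in that literature (the exact windowing and normalization in ApRESDefs.py may differ; the function name and window size here are placeholders):

```python
import numpy as np

def coherence(s1, s2, win=20):
    """Depth-windowed complex coherence between two complex return profiles.

    A common form of the polarimetric coherence (e.g. between hh and vv
    profiles); this is a sketch, not the ApRESDefs.py implementation.
    """
    s1 = np.asarray(s1)
    s2 = np.asarray(s2)
    c = np.full(s1.shape, np.nan + 0j)
    half = win // 2
    for i in range(half, len(s1) - half):
        a = s1[i - half:i + half]
        b = s2[i - half:i + half]
        num = np.sum(a * np.conj(b))
        den = np.sqrt(np.sum(np.abs(a) ** 2) * np.sum(np.abs(b) ** 2))
        c[i] = num / den if den > 0 else np.nan
    return c
```

For identical inputs the coherence magnitude is 1 everywhere the window fits; decorrelation between the two profiles pulls it toward 0.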
I would love to contribute some code that I already have for this stuff if it is useful.
Hi @benhills ! Thanks for taking a look!
We are definitely interested in adding a polarimetric ApRES processing flow. I will start another issue for this, as I just merged #20, which deals with the primary topic of this thread.
@Elizabethcase is planning on collating many measurements made at different places, and at each place there are multiple measurements each with a different antenna orientation.
The antenna orientation is encoded in the filenames of the .dat files as `hh`, `hv`, `vh`, and `vv`. So the plan is to use `ApRESDefs.load_all()` to load only the files corresponding to `hh`, then `hv`, etc., to get four xarrays, and then `xr.concat` them together along a new dimension called `orientation`. The coordinates of that dimension could be the strings `hh`, `hv`, etc. But we will have to see how nicely that works with plotting etc.; it might be easier to keep them as integers and put info in the attributes about what each integer means in terms of `hh`, `hv`, etc.

The only issue is that the timestamps of the measurements taken with different orientations in the same place will all be slightly different, so when you concat along the `orientation` dim, the data won't all fall nicely along the time dimension. I guess one way around this is to adjust the times of the `hv`, `vh`, and `vv` data to match the `hh` times before the concat step. We will have to take care to match them up based on being close in time and/or lat/lon first, though.
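The time-snapping-then-concat idea above could look something like this sketch. Everything here is hypothetical stand-in data (the real arrays would come from `ApRESDefs.load_all()`), and the snapping step just overwrites the offset times with the `hh` times after a closeness check:

```python
import numpy as np
import pandas as pd
import xarray as xr

def make_da(times, seed):
    # Stand-in for one orientation's loaded profile stack.
    rng = np.random.default_rng(seed)
    return xr.DataArray(
        rng.standard_normal((len(times), 5)),
        dims=("time", "depth"),
        coords={"time": times, "depth": np.arange(5)},
    )

base = pd.date_range("2023-01-01", periods=3, freq="h")
hh = make_da(base, 0)
hv = make_da(base + pd.Timedelta("2min"), 1)  # slightly offset timestamps

# Check the hv times are close to the hh times, then snap them onto
# the hh times so the shared time dimension lines up exactly.
assert (abs(hv.time.values - hh.time.values) < pd.Timedelta("10min")).all()
hv = hv.assign_coords(time=hh.time.values)

# Concat along a new string-labelled "orientation" dimension.
pol = xr.concat([hh, hv], dim=pd.Index(["hh", "hv"], name="orientation"))
```

In practice `hv.reindex(time=hh.time, method="nearest", tolerance=...)` may be a safer way to do the matching, and the lat/lon check would need to happen before any of this.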