aladinor closed this issue 1 year ago.
@aladinor Yes, that's a bug. @JulianGiles found this behaviour on iris data, too. We will add some context in the next few days.
But you're already quite close. It's the jittering angles that make the current retrieval of the angle spacing less robust.
So, there is no good solution to this issue. The only thing we can do is to remove/adapt the one quantile thing here:
The reasoning behind the current setup was to remove glitches (double rays) with very small angle differences. But this is worse for datasets without these small angle differences, since it shifts the median to higher differences. I'm thinking about adding a quantile-kwarg to remove differences below and above a certain threshold before calculating the median.
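A minimal sketch of that quantile idea (the `lo`/`hi` quantile bounds and the helper name are illustrative assumptions, not the actual xradar API):

```python
import numpy as np

def trimmed_median_diff(angles, lo=0.1, hi=0.9):
    """Hypothetical helper: median angle difference after discarding
    differences outside the [lo, hi] quantile range."""
    diff = np.diff(np.sort(angles))
    qlo, qhi = np.quantile(diff, [lo, hi])
    kept = diff[(diff >= qlo) & (diff <= qhi)]
    return np.median(kept)

# nominal 1-degree spacing plus a glitch (double ray at ~120.5)
angles = np.concatenate([np.arange(0.5, 360.0, 1.0), [120.501]])
print(trimmed_median_diff(angles))  # close to 1.0 despite the glitch
```

The trimming discards the tiny glitch difference before the median is taken, so the estimate is not dragged away from the nominal spacing.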
Nevertheless, if the angle differences are jittering too much, we can't do anything about it. Here I'm thinking about checking the std/var to get an impression of whether the provided angles are spread too wide, and issuing a warning to the user.
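A rough sketch of that check (the function name, the 0.1-degree threshold, and the warning text are made up for illustration):

```python
import warnings
import numpy as np

def check_angle_spread(angles, max_std=0.1):
    """Warn if the ray-to-ray angle differences jitter too much for a
    reliable angle-resolution estimate (threshold is illustrative)."""
    diff = np.diff(np.sort(angles))
    spread = diff.std()
    if spread > max_std:
        warnings.warn(
            f"angle differences spread too wide (std={spread:.3f}); "
            "angle resolution estimate may be unreliable"
        )
    return spread

rng = np.random.default_rng(12345)
jittered = np.arange(0.5, 360.0, 1.0) + rng.uniform(-0.25, 0.25, 360)
check_angle_spread(jittered)  # jitter this strong triggers the warning
```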
@aladinor @mgrover1 I'd like to get your thoughts on that, too.
Here is how I would do it now:
```python
import numpy as np
import xarray as xr
import xradar as xd

# create a test sweep with jittered azimuth angles
seed = 12345
rng = np.random.default_rng(seed)
# noise = rng.normal(scale=0.11, size=360)
noise = rng.uniform(low=-0.25, high=0.25, size=360)
print(f"Noise added, min: {noise.min()}, max: {noise.max()}")
azimuth = np.arange(0.5, 360.0, 1.0) + noise
ds = xd.model.create_sweep_dataset(a1gate=0, direction=1, azimuth=azimuth)
ds = ds.swap_dims({"time": "azimuth"})
ds = ds.sortby("azimuth")
ds = ds.assign_coords(sweep_mode="azimuth_surveillance")

# remove a ray (index 100) to simulate a gap
ds_in = xr.concat(
    [
        ds.isel(azimuth=slice(None, 100)),
        ds.isel(azimuth=slice(101, None)),
    ],
    "azimuth",
    data_vars="minimal",
)
print(ds_in)

# extract angle resolution
first_angle = "azimuth"
fdim = ds_in[first_angle]
diff = fdim.diff(first_angle)
# print(diff.values)

# calculate std/median of the angle differences
std = diff.std(skipna=True)
median = diff.median(skipna=True)
print(f"STD: {std.values}, MEDIAN: {median.values}")

# keep only differences within one std of the median, then take the median again
diff_cutted = diff.where((diff >= (median - std)) & (diff <= (median + std)))
print(diff_cutted.values)
median_diff = diff_cutted.median(skipna=True)

# round angle resolution to two decimals (should work for almost all cases)
angle_res = np.round(median_diff, decimals=2)
print(f"MEDIAN: {median_diff.values}, ANGLE_RES: {angle_res.values}")
```
@aladinor I've added a fix in #118 which should work for most of the use cases. Only for very random angle data does this start to fail.
Hi @kmuehlbauer. Thanks for looking at this issue. I will test it when it is merged.
Hi everyone,
I am trying to apply the angle reindexing function to my dataset as explained here. The radar data I am analyzing covers 360 degrees in azimuth, as follows:

The datatree object has 360 angles in azimuth. Then, I try to reindex the azimuth angles using the `fix_angle` function. The result looks like this:

Now our new xradar object has 356 angles in the azimuth dimension.

Digging into util.py, it seems that at line #L291 the angle resolution is affected by the number of decimals of the `np.round` method. If we change the number of decimals to 1, the output will look like this:
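The rounding sensitivity can be reproduced with a toy value (the 0.96 median below is made up to mimic what a jittered sweep might produce):

```python
import numpy as np

# a median angle difference a jittered sweep might yield (illustrative)
median_diff = 0.96

print(np.round(median_diff, decimals=2))  # 0.96 -> reindexing drops rays
print(np.round(median_diff, decimals=1))  # 1.0  -> matches the nominal 1-degree spacing
```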
Please let me know your thoughts.