After I've done incoherent time averaging on a UVPSpec object, my blts axis has size equal to the number of baselines (each of which has one "time"). However, if not all the times are the same, spherical averaging doesn't combine them correctly. As an example, if I do this:
```python
import copy
import numpy as np
from tqdm import tqdm
import hera_pspec as hp

uvp_tavg = copy.deepcopy(uvp_tavg_1_to_5)
sph_avgs = []
for spw in tqdm(uvp_tavg.spw_array):  # was uvp.spw_array, but uvp_tavg is the object in use
    dk_multiplier = 2.0  # multiplier on the natural k_para spacing that sets the spherical k bin size (Delta k)
    k_start_multiplier = 0.75  # center of the first spherical k bin, in units of Delta k
    dk = dk_multiplier * np.median(np.diff(uvp_tavg.get_kparas(spw)))
    kbins = np.arange(k_start_multiplier * dk, 2.5, dk)  # even spacing
    uvp_tavg_this_spw = uvp_tavg.select(spws=[spw], inplace=False)
    sph_avgs.append(hp.grouping.spherical_average(uvp_tavg_this_spw, kbins, dk, error_weights='P_N'))
    print(uvp_tavg.data_array[spw].shape, sph_avgs[-1].data_array[0].shape)
```
I get
The number of blts is the number of unique baselines (801), but spherical averaging doesn't reduce it to 1... it reduces to 30.
Probably related: `np.unique(uvp_tavg_1_to_5.time_1_array).shape` is `(15,)`.
However, if I force the integrations to have the same time in a rather hacky way, by doing the following:
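(My original snippet is omitted here; the following is a hypothetical sketch of what such a workaround might look like. It assumes `UVPSpec` exposes `time_1_array`, `time_2_array`, and `time_avg_array` as plain NumPy arrays that can be overwritten in place; check this against your hera_pspec version. The helper name is made up for illustration.)

```python
import numpy as np

def force_common_time(uvp):
    """Hacky workaround sketch: overwrite every per-blt time array with a
    single common time (the mean), so that an already time-averaged UVPSpec
    looks like it has exactly one time. Mutates uvp in place and returns it."""
    t0 = np.mean(uvp.time_avg_array)
    uvp.time_1_array = np.full_like(uvp.time_1_array, t0)
    uvp.time_2_array = np.full_like(uvp.time_2_array, t0)
    uvp.time_avg_array = np.full_like(uvp.time_avg_array, t0)
    return uvp
```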
I get
which is what I expect. It also goes much faster.
There should probably be an option on `hp.grouping.spherical_average` to handle this case, even if it's not the default behavior. Some sort of `assume_time_averaged=True`...
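As a sketch of how such an option could behave, here is a user-side wrapper rather than a change to hera_pspec itself: collapse all time metadata to one value on a copy, then call the averager. The `time_*_array` attribute names follow UVPSpec conventions but should be verified; `spherical_average_fn` is a stand-in parameter for `hp.grouping.spherical_average`, and the wrapper name is invented for illustration.

```python
import copy
import numpy as np

def spherical_average_assume_tavg(uvp, kbins, dk, spherical_average_fn, **kwargs):
    """Sketch of assume_time_averaged=True as a wrapper: collapse all times
    to a single value on a deep copy (so the caller's object is untouched),
    then delegate to the provided spherical-averaging function."""
    uvp = copy.deepcopy(uvp)
    t0 = np.mean(uvp.time_avg_array)
    for attr in ("time_1_array", "time_2_array", "time_avg_array"):
        setattr(uvp, attr, np.full_like(getattr(uvp, attr), t0))
    return spherical_average_fn(uvp, kbins, dk, **kwargs)
```

Doing the collapse on a copy inside the function keeps the caller's per-baseline times intact, which matters if the same UVPSpec is reused elsewhere.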