I came across a case where writing `None` into an HDF5 dataset causes an error.
This can be a problem when processing old S1 data, because some values in the noise annotation (e.g. `azimuth_first_azimuth_line`) are missing. The `BurstNoise` class fills those missing values with `None`, and `corrections_to_h5group()` in `h5_helpers.py` would then attempt to write `None` into the output HDF5 file.
This fix writes `np.nan` instead of `None` to avoid the error. The descriptions of the corresponding metadata are updated accordingly.
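A minimal sketch of the idea, using `h5py` directly (the helper name `to_h5_value` and the dataset description string are hypothetical, for illustration only — not the actual code in `h5_helpers.py`):

```python
import os
import tempfile

import h5py
import numpy as np


def to_h5_value(value):
    # Hypothetical helper: HDF5 float datasets cannot store Python None,
    # so substitute np.nan for missing annotation values before writing.
    return np.nan if value is None else value


path = os.path.join(tempfile.mkdtemp(), "corrections.h5")
with h5py.File(path, "w") as f:
    # e.g. azimuth_first_azimuth_line absent from an old S1 noise annotation
    f["azimuth_first_azimuth_line"] = to_h5_value(None)
    # Metadata description notes that NaN marks an unannotated value
    f["azimuth_first_azimuth_line"].attrs["description"] = (
        "First azimuth line of the noise vector (NaN if not annotated)"
    )

with h5py.File(path, "r") as f:
    val = f["azimuth_first_azimuth_line"][()]
print(np.isnan(val))  # True
```

Writing `to_h5_value(None)` stores a scalar NaN, which reads back cleanly, whereas assigning `None` directly would fail at write time.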