Closed gerritholl closed 2 years ago
Just to get some clarification: is `PhysicUnit` in `save_dataset` supposed to change the metadata (the scale/factor) to reflect the desired unit, or is it supposed to scale the data before writing to the image? Or is it just supposed to describe the data as it already is?
The units of the data in the Scene before `save_dataset` are Kelvin. The user passes `PhysicUnit`. As far as NinJo is concerned, `PhysicUnit` should describe the unit of the quantities obtained after the offset and scale factor are applied. Currently, setting the `PhysicUnit` in the ninjogeotiff writer only sets the `PhysicUnit` header. It does not currently change the data or any other header. The result is that NinJo displays values that are Kelvin, but labels them as °C.
The old ninjotiff writer gets `physic_unit` and, apart from writing the value to the header, changes the data before applying any enhancement. However, it also took arguments for the minimum and maximum values of the data that were used to write the headers. When using both crude scaling and passing min/max to the writer explicitly, the unit conversion had no effect; therefore, I did not include it in the ninjogeotiff writer. However, it was up to the user to make sure that the values passed in crude scaling (in K) were consistent with the ones passed to the writer (in °C).
We went from doing both (redundantly) to neither (leading to wrong units). I don't think the writer should change the data. As long as the data are in K but should be interpreted in °C, there is no need to: all we need to do is, when a `PhysicUnit` inconsistent with the data is passed (K vs. °C or the other way around), adapt the offset (in the case of NinJo: `AxisIntercept`) accordingly, such that the data interpretation is correct.
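A minimal sketch of that idea (a hypothetical helper, not the actual writer code): when the requested `PhysicUnit` disagrees with the unit of the data, shift the offset rather than the data.

```python
# Hypothetical sketch: adapt the AxisIntercept (the offset in
# physical = pixel * gradient + intercept) when the data are in K
# but PhysicUnit says C, or vice versa. Not the actual writer code.
def adapt_axis_intercept(intercept, data_unit, physic_unit):
    shift = {("K", "C"): -273.15, ("C", "K"): +273.15}
    return intercept + shift.get((data_unit, physic_unit), 0.0)

print(adapt_axis_intercept(273.15, "K", "C"))  # 0.0 -> pixels now read as °C
```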
Understood. One hack, and probably not the best solution, would be to add an `enhancement_history` entry to the `.attrs` before giving it to `save_image`, so that the final scale/offset results look like °C data that was crude-stretched instead of Kelvin (since the XRImage combines all scales and offsets before saving them to the GeoTIFF).
Changing the `enhancement_history` would be a hack. Is changing the pixel value the lesser evil? That's what the ninjotiff writer does.
Where would the pixel value be changed? In the ninjogeotiff writer?
> Where would the pixel value be changed? In the ninjogeotiff writer?
Yes.
I suppose that could have some nasty effects if writing to multiple file formats, or doing anything else with the data.
Or one would have to copy the data, which is also not good.
Actually, that side effect could also happen if I change the enhancement history.
The latter side-effect is already happening today… see #2022
Modifying the data (the `DataArray.data` dask array) should be fine as long as it isn't done in place. As described in #2022, if it is done on the XRImage, that should be fine, as the DataArray is a copy. If it is done with the user-provided DataArray, then it would first need to be copied, or the xarray operations on it would have to return a new DataArray (which would be a copy).
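For illustration, plain (non-in-place) array arithmetic already returns a new object and leaves the caller's data untouched; a small sketch with a NumPy array standing in for `DataArray.data`:

```python
import numpy as np

data = np.array([273.15, 300.0])  # stand-in for DataArray.data (Kelvin)
converted = data - 273.15         # returns a NEW array; `data` is untouched

print(data[0], converted[0])      # 273.15 0.0
```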
Although the fix as currently implemented, by changing the enhancement history, appears to work, there is a problem. To get a crudely stretched image that looks correct, the ninjogeotiff writer needs source units:

```yaml
kwargs: {stretch: crude, min_stretch: 313.5, max_stretch: 186}
```

but the ninjotiff writer needs destination units:

```yaml
kwargs: {stretch: crude, min_stretch: 40, max_stretch: -87}
```
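The two parameter sets describe the same physical range shifted by the K-to-°C offset of 273.15 (the °C values are evidently rounded to whole degrees); a quick check:

```python
# The K-based and °C-based crude-stretch limits differ by 273.15;
# the °C configuration appears to round to whole degrees.
k_min, k_max = 313.5, 186.0
print(round(k_min - 273.15, 2), round(k_max - 273.15, 2))  # 40.35 -87.15
```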
I'm not sure I follow. What do you mean by source and destination units? The scale/factor values follow this path for a non-NinJo GeoTIFF:

```mermaid
graph LR
src(Kelvin) --> enhance(Crude Normalization);
enhance --> dtype(File Data Type - uint8);
```

And yes, I used this to learn GitHub's support for graphs in Markdown.
Or maybe it should be:

```mermaid
graph LR
src(Kelvin) -- crude stretch --> enhanced(Normalized 0-1);
enhanced -- crude stretch --> dtype(File Data Type - 0-255);
```
By source units, I mean `K`. By destination units, I mean `C`. In the implementation, I've used

```python
data.attrs["enhancement_history"].insert(0, {"scale": 1, "offset": 273.15})
```
I don't exactly understand what's going on either. Some observations. Our previous/current ninjotiff production passes:

- `ch_min_measurement_unit` and `ch_max_measurement_unit` to the writer in °C (those are used to set offset and scale factor);
- `min_stretch` and `max_stretch` to the enhancement configuration in K.

Just now I switched from `GenericCompositor` to `SingleBandCompositor`, which broke our (development) production. The units attribute is no longer lost, so ninjotiff is changing the data before it's being stretched, and passing units of `K` to the enhancer no longer works as desired (the image is all white).
As I've currently implemented the units conversion for ninjogeotiff in this PR, I'm not changing the data, but rather inserting an enhancement history entry in front of all enhancements (I found that appending it to the end didn't work as expected, although I don't exactly understand why). Images appear to be produced correctly when using a `SingleBandCompositor`. The ninjogeotiff writer does not take `ch_min_measurement_unit` or `ch_max_measurement_unit` arguments, as those should be redundant compared to the stretching. Setting offset and scale factor is delegated to the GeoTIFF writer, which I think calculates them from the enhancement history. With the crude normalisation parameters defined in K, the ninjogeotiff image and its metadata look correct.
I'm going to try to do the same as the ninjotiff writer now, so that we can use the same configuration for both.
It seems that when the data reach the `ninjogeotiff` writer, they've already been normalised/stretched to be between 0 and 1, so subtracting 273 does not have the intended effect.
You are correct that the XRImage in trollimage (and the act of saving a GeoTIFF) will combine the enhancement history's series of scale factors and offsets. So it does matter where in the history the new/fake entry is added. I'm not sure I follow in your comments what about the faked `enhancement_history` didn't work. Maybe it needs to be `-273.15` instead of `+273.15` for the offset? Not sure off the top of my head.
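Under the assumption that each `enhancement_history` entry records `out = in * scale + offset` and is applied in list order, a toy reconstruction of the combining logic (not trollimage's actual code) shows why prepending `{"scale": 1, "offset": 273.15}` makes the inverted GeoTIFF tags read as °C:

```python
def combine(history):
    # Fold a list of {scale, offset} steps (assumed out = in*scale + offset)
    # into one overall scale and offset.
    scale, offset = 1.0, 0.0
    for entry in history:
        scale *= entry["scale"]
        offset = offset * entry["scale"] + entry["offset"]
    return scale, offset

def to_physical(pixel, scale, offset):
    # GeoTIFF-style interpretation: physical = (pixel - offset) / scale
    return (pixel - offset) / scale

# Crude stretch from 313.5 K .. 186 K onto 0 .. 1:
stretch = {"scale": 1 / (186.0 - 313.5), "offset": -313.5 / (186.0 - 313.5)}
fake = {"scale": 1, "offset": 273.15}

pixel = 273.15 * stretch["scale"] + stretch["offset"]  # a 273.15 K sample

s, o = combine([stretch])        # real history only: recovers 273.15 (K)
print(round(to_physical(pixel, s, o), 2))

s, o = combine([fake, stretch])  # fake entry prepended: recovers ~0 (°C)
print(round(to_physical(pixel, s, o), 2))
```

So with the fake entry inserted at position 0, the same pixel values are interpreted 273.15 lower, i.e. as degrees Celsius, with `+273.15` (not `-273.15`) as the offset.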
For a normal radiance/reflectance/BT product, the image will be normalized in `save_dataset` before being passed to `save_image`. However, your `.insert` should have caused this to be correct... I would think.
> It seems that when the data reach the `ninjogeotiff` writer, they've already been normalised/stretched to be between 0 and 1, so subtracting 273 does not have the intended effect.
Ah, that's because I have implemented `save_image` and not `save_dataset`. I had a reason for that at the time, but I don't remember what it is. However, it does mean it is "too late" to change the values.
The faked `enhancement_history` works. But the ninjotiff writer uses a different strategy, by changing the data. The effect is that for the ninjogeotiff writer with the fake enhancement, the crude stretch parameters need to be in K, whereas for the ninjotiff writer, with the data changed before anything is enhanced, the crude stretch parameters need to be in °C. When we want to write both ninjotiff and ninjogeotiff, that's rather inconvenient.
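Concretely, that would mean maintaining two nearly identical enhancement sections, one per writer (a hypothetical sketch, assuming the usual Satpy enhancement YAML layout):

```yaml
# Hypothetical sketch: the same IR product would need two stretches,
# differing only by the 273.15 K offset.
ir_for_ninjogeotiff:   # stretch in source units (K)
  operations:
    - name: stretch
      method: !!python/name:satpy.enhancements.stretch
      kwargs: {stretch: crude, min_stretch: 313.5, max_stretch: 186}
ir_for_ninjotiff:      # stretch in destination units (°C)
  operations:
    - name: stretch
      method: !!python/name:satpy.enhancements.stretch
      kwargs: {stretch: crude, min_stretch: 40, max_stretch: -87}
```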
So arguably the problem here is rather with ninjotiff. It's just that we've never noticed before, because we accidentally deleted the units tag by using the `GenericCompositor`.
Ah ok, I think I see what you're saying. I expected this, though. The faked `enhancement_history` is to trick NinJo, not to trick Satpy. The `enhancement_history` doesn't change anything about how future enhancements are applied, but it does change the tags that downstream applications (NinJo) will use.
Describe the bug

When writing single-channel terrestrial infrared images using the ninjogeotiff writer, the values as interpreted with scale factor and offset do not correspond to the units passed to the writer and written to the metadata. Even if `PhysicUnit` is set to `C`, the `AxisIntercept` is set such that the values correspond to K. In the (experimental) GeoTIFF import under development in NinJo, this leads to values being displayed wrongly, such as "273 °C" when that should be either "273 K" or "0 °C".

To Reproduce
Then, with the resulting file:
Expected behavior
I expect the result of `gdalinfo ... | grep`:
Actual results
The result of the `gdalinfo` command is actually:

Environment Info:
Additional context
We identified this problem when experimenting with the new NinJo GeoTIFF import.