pytroll / satpy

Python package for earth-observing satellite data processing
http://satpy.readthedocs.org/en/latest/
GNU General Public License v3.0

Wrong AxisIntercept (add_offset) when writing °C temperature units with ninjogeotiff writer #2018

Closed. gerritholl closed this issue 2 years ago.

gerritholl commented 2 years ago

Describe the bug

When writing single-channel terrestrial infrared images using the ninjogeotiff writer, the values as interpreted with scale factor and offset do not correspond to the units passed to the writer and written to the metadata. Even if PhysicUnit is set to C, the AxisIntercept is set such that the values correspond to K. In the (experimental) GeoTIFF import under development in NinJo, this leads to values being displayed wrongly, such as "273 °C" when that should be either "273 K" or "0 °C".

To Reproduce

import os
from glob import glob

from pyproj import CRS
from pyresample import create_area_def
from satpy import Scene
from satpy.utils import debug_on
from sattools.io import plotdir

debug_on()

# GOES-16 ABI full-disk L1b files for a single time slot
abi_east_files = glob("/media/nas/x21308/abi/ABI-L1B-RadF/2019/023/18/OR_ABI-L1b-RadF-M3C*_G16_s20190231800344_e*_c*.nc")
###
# lat/lon (EPSG:4326) target area in the southern hemisphere
south4326 = create_area_def(
        "epsg4326south",
        CRS.from_epsg(4326),
        resolution=1/128,
        area_extent=(-90, -60, -30, -30),
        proj_id="EQC_PLAT")
sc = Scene(filenames=abi_east_files, reader=["abi_l1b"])
sc.load([10.3])  # 10.3 µm brightness temperatures, in K
ls = sc.resample(south4326)
###
out = plotdir() / "{start_time:%Y%m%d%H%M}-{platform_name}-{sensor}-{area.area_id}-{name}-bug.tif"

ls.save_dataset(
        10.3,
        os.fspath(out),
        writer="ninjogeotiff",
        PhysicUnit="C",  # declare °C, although the data are in K
        PhysicValue="Temperature",
        SatelliteNameID=0,
        ChannelID=0,
        fill_value=0,
        DataType="GORN",
        DataSource="unknown")

Then, with the resulting file:

gdalinfo 201901231800-GOES-16-abi-epsg4326south-C13-bug.tif | grep AxisIntercept

Expected behavior

I expect the result of the gdalinfo ... | grep command above to be:

  ninjo_AxisIntercept=-63.269654408728485


Actual results

The result of the gdalinfo command is actually:

  ninjo_AxisIntercept=209.8803455912715
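
Note that the actual and expected intercepts differ by exactly 273.15, i.e. by 0 °C expressed in K: the writer computes an intercept that maps pixel values to K, even though PhysicUnit announces °C.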


Additional context

We identified this problem when experimenting with the new NinJo GeoTIFF import.

djhoese commented 2 years ago

Just to get some clarification:

  1. What are the units of the data in the Scene before save_dataset?
  2. Is PhysicUnit supposed to change the metadata (the scale/factor) to reflect the desired unit or is it supposed to scale the data before writing to the image?
djhoese commented 2 years ago

Or is it just supposed to describe the data as it already is?

gerritholl commented 2 years ago

The units of the data in the Scene before save_dataset are Kelvin. The user passes PhysicUnit. As far as NinJo is concerned, PhysicUnit should describe the unit of the quantities obtained after the offset and scale factor are applied. Currently, setting the PhysicUnit in the ninjogeotiff writer only sets the PhysicUnit header. It does not currently change the data or any other header. The result is that NinJo displays values that are Kelvin, but labels them as °C.
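
For illustration, a minimal sketch of that interpretation: assuming NinJo reconstructs physical values linearly from the stored counts using the Gradient and AxisIntercept tags (tag names as written by the ninjogeotiff writer; the function itself is hypothetical):

def ninjo_displayed_value(count, gradient, axis_intercept):
    # Assumed reconstruction on the NinJo side: the result is labelled with
    # PhysicUnit, so Gradient/AxisIntercept must map counts to that unit.
    return gradient * count + axis_intercept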

The old ninjotiff writer takes physic_unit and, apart from writing the value to the header, converts the data before any enhancement is applied. However, it also takes arguments for the minimum and maximum value of the data, which are used to write the headers. When using both crude scaling and passing min/max to the writer explicitly, the unit conversion had no effect; therefore, I did not include it in the ninjogeotiff writer. It was, however, up to the user to make sure that the values passed in crude scaling (in K) were consistent with the ones passed to the writer (in °C).

We went from doing both (redundantly) to doing neither (leading to wrong units). I don't think the writer should change the data, and as long as the data are in K but should be interpreted in °C, there is no need to. All we need to do is adapt the offset (in the case of NinJo: the AxisIntercept) whenever the PhysicUnit passed is inconsistent with the data units (K declared as °C or the other way around), so that the data interpretation remains correct.
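
A minimal sketch of that adjustment, assuming only the K/°C pair needs handling (hypothetical helper, not actual writer code):

def adjust_axis_intercept(axis_intercept, data_unit, physic_unit):
    # Shift the intercept by 0 °C = 273.15 K when the declared unit disagrees
    # with the data unit, so gradient*count + intercept comes out in PhysicUnit.
    if (data_unit, physic_unit) == ("K", "C"):
        return axis_intercept - 273.15
    if (data_unit, physic_unit) == ("C", "K"):
        return axis_intercept + 273.15
    return axis_intercept

Applied to the numbers above, this turns the actual intercept of about 209.88 into the expected -63.27.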

djhoese commented 2 years ago

Understood. One hack, and probably not the best solution, would be to add an enhancement_history entry to the .attrs before giving it to save_image, so that the final scale/offset make it look like °C data was crude-stretched instead of Kelvin (since the XRImage combines all scales and offsets before saving them to the GeoTIFF).

gerritholl commented 2 years ago

Changing the enhancement_history would be a hack. Is changing the pixel value the lesser evil? That's what the ninjotiff writer does.

djhoese commented 2 years ago

Where would the pixel value be changed? In the ninjogeotiff writer?

gerritholl commented 2 years ago

Where would the pixel value be changed? In the ninjogeotiff writer?

Yes.

I suppose that could have some nasty effects if writing to multiple file formats, or doing anything else with the data.

Or one would have to copy the data, which is also not good.

gerritholl commented 2 years ago

Actually, that side effect could also happen if I change the enhancement history.

gerritholl commented 2 years ago

The latter side-effect is already happening today… see #2022

djhoese commented 2 years ago

Modifying the data (the DataArray.data dask array) should be fine as long as it isn't done in-place. As described in #2022, if it is done on the XRImage, that should be fine, as the DataArray there is a copy. If it is done with the user-provided DataArray, then it would first need to be copied, or the xarray operations on it would have to return a new DataArray (which would be a copy).
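
A minimal sketch of the not-in-place variant (hypothetical names; shown with a small numpy-backed array, but the same applies to dask-backed ones):

import numpy as np
import xarray as xr

data_arr = xr.DataArray(np.array([[270.0, 280.0]]), attrs={"units": "K"})
# Arithmetic returns a new DataArray; the caller's data are untouched.
converted = data_arr - 273.15
converted.attrs = dict(data_arr.attrs, units="C")  # copy attrs, update units
# Mutating data_arr.data in place, by contrast, would also change the
# caller's array.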

gerritholl commented 2 years ago

Although the fix as currently implemented, by changing the enhancement history, appears to work, there is a problem. To get a crudely stretched image that looks correct, the ninjogeotiff writer needs its stretch parameters in source units:

kwargs: {stretch: crude, min_stretch: 313.5, max_stretch: 186}

but the ninjotiff writer needs destination units:

kwargs: {stretch: crude, min_stretch: 40, max_stretch: -87}
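
(These describe approximately the same range in the two units: 313.5 K ≈ 40.35 °C and 186 K ≈ -87.15 °C.)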

djhoese commented 2 years ago

I'm not sure I follow. What do you mean by source and destination units? The scale/factor values follow this path for a non-ninjo geotiff:

graph LR
  src(Kelvin) --> enhance(Crude Normalization);
  enhance --> dtype(File Data Type - uint8);

And yes, I used this to learn GitHub's support for graphs in markdown.

djhoese commented 2 years ago

Or maybe it should be:

graph LR
  src(Kelvin) -- crude stretch --> enhanced(Normalized 0-1);
  enhanced -- crude stretch --> dtype(File Data Type - 0-255);
gerritholl commented 2 years ago

By source units, I mean K. By destination units, I mean C. In the implementation, I've used

data.attrs["enhancement_history"].insert(0, {"scale": 1, "offset": 273.15})

I don't exactly understand what's going on either, but some observations on our previous/current ninjotiff production:

Just now I switched from GenericCompositor to SingleBandCompositor, which broke our (development) production. The units attribute is no longer lost, so ninjotiff now converts the data before they are stretched, and stretch parameters given in K no longer work as desired (the image is all white).

As I've currently implemented the units conversion for ninjogeotiff in this PR, I'm not changing the data, but rather inserting an enhancement history entry in front of all enhancements (I found that appending it to the end didn't work as expected, although I don't exactly understand why). Images appear to be produced correctly when using a SingleBandCompositor. The ninjogeotiff writer does not take ch_min_measurement_unit or ch_max_measurement_unit arguments, as those should be redundant compared to the stretching. Setting offset and scale factor is delegated to the GeoTIFF writer, which I think calculates them from the enhancement history. With the crude normalisation parameters defined in K, the ninjogeotiff image and its metadata look correct.

gerritholl commented 2 years ago

I'm going to try to do the same as the ninjotiff writer now, so that we can use the same configuration for both.

gerritholl commented 2 years ago

It seems that when the data reach the ninjogeotiff writer, they've already been normalised/stretched to be between 0 and 1, so subtracting 273 does not have the intended effect.

djhoese commented 2 years ago

You are correct that the XRImage in trollimage (and the act of saving a GeoTIFF) will combine the enhancement history's series of scale factors and offsets, so it does matter where in the history the new/fake entry is added. I'm not sure I follow from your comments what about the faked enhancement_history didn't work. Maybe it needs to be -273.15 instead of +273.15 for the offset? Not sure off the top of my head.
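
For illustration, a minimal sketch (plain Python rather than trollimage) of how a history of linear steps composes, and why the position of the fake entry matters:

def combine(history):
    # Each step applies y = scale*x + offset; composing a further step gives
    # s2*(s*x + o) + o2 = (s2*s)*x + (s2*o + o2).
    s, o = 1.0, 0.0
    for step in history:
        s, o = step["scale"] * s, step["scale"] * o + step["offset"]
    return s, o

# Crude stretch with the limits quoted earlier (313.5 K -> 0, 186 K -> 1):
crude = {"scale": 1 / (186 - 313.5), "offset": -313.5 / (186 - 313.5)}
# Fake first entry claiming the data started as °C and had 273.15 added
# (°C -> K) before the stretch:
fake = {"scale": 1.0, "offset": 273.15}
s, o = combine([fake, crude])
# The GeoTIFF tags store the inverse, x = (y - o)/s, so the recovered values
# (and hence the AxisIntercept) come out in °C rather than K. Appending the
# fake entry at the end instead would shift the already-normalized values,
# which may be why appending did not work as expected.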

djhoese commented 2 years ago

For a normal radiance/reflectance/BT product the image will be normalized in save_dataset before being passed to save_image. However, your .insert should have caused this to be correct...I would think.

gerritholl commented 2 years ago

It seems that when the data reach the ninjogeotiff writer, they've already been normalised/stretched to be between 0 and 1, so subtracting 273 does not have the intended effect.

Ah, that's because I have implemented save_image and not save_dataset. I had a reason for that at the time, but I don't remember what it was. However, it does mean it is "too late" to change the values.

The faked enhancement_history works. But the ninjotiff writer uses a different strategy by changing the data. The effect is that for the ninjogeotiff writer with the fake enhancement, the crude stretch parameters need to be in K. For the ninjotiff writer with the data changed before anything is enhanced, the crude stretch parameters need to be in °C. When we want to write both ninjotiff and ninjogeotiff, that's rather inconvenient.

gerritholl commented 2 years ago

So arguably the problem here is rather with ninjotiff; we just never noticed before, because we accidentally deleted the units attribute by using the GenericCompositor.

djhoese commented 2 years ago

Ah ok, I think I see what you're saying. I expected this though. The faked enhancement_history is to trick ninjo, not to trick satpy. The enhancement_history doesn't change anything about how future enhancements are applied, but it does change the tags that downstream applications (ninjo) will use.
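
(Concretely: with the fake entry in place, the combined inverse transform should yield an AxisIntercept near the expected -63.27 rather than 209.88, while no future enhancement behaves any differently.)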