As mentioned in #41, proper image calibration needs to account for the bias and dark current of an image. This is signal that isn't caused by light from the scene, so it offsets the brightness levels of the image. This manifests mainly as reduced contrast, and it can also skew the color balance in shadowed regions when color balance weights are applied.
Subtracting the bias also ensures correct application of the flat frame, since the flat is usually corrected for this offset as well.
Unlike for the ECAMs, the bias for the cameras with the KAI-2020 chip isn't defined as a static value, and it is also partially pre-subtracted on the rover (with the correction value unknown for the public raw images), so it must be derived from the image itself:
The KAI-2020CM sensor has masked pixel regions to the left and right of the photosensitive pixels. Measuring and averaging the brightness over these pixels gives a good estimate of the bias and dark current in the image.
As per the datasheet, the manufacturer recommends using only the middle 14 pixels in this column, because the others may be affected by light leak.
https://www.onsemi.com/pdf/datasheet/kai-2020-d.pdf
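
A minimal sketch of that estimation in Python/NumPy, assuming the masked strips are full-height columns at the image edges. The column offsets, the way the "middle 14 pixels" are selected, and the function names are illustrative placeholders, not values taken from the datasheet:

```python
import numpy as np

def estimate_bias_dark(raw: np.ndarray,
                       left_cols: slice = slice(0, 16),      # placeholder bounds, check datasheet
                       right_cols: slice = slice(-16, None),  # placeholder bounds, check datasheet
                       n_central: int = 14) -> float:
    """Estimate bias + dark current from the masked (optically black) pixel
    regions to the left and right of the photosensitive area, keeping only the
    central `n_central` pixels of each strip per the datasheet recommendation."""
    samples = []
    for cols in (left_cols, right_cols):
        strip = raw[:, cols]
        # keep only the central n_central columns of the masked strip
        mid = strip.shape[1] // 2
        start = max(mid - n_central // 2, 0)
        samples.append(strip[:, start:start + n_central].ravel())
    return float(np.mean(np.concatenate(samples)))

def subtract_bias_dark(raw: np.ndarray, level: float) -> np.ndarray:
    """Subtract the estimated offset and clip at zero to avoid negative values."""
    return np.clip(raw.astype(np.float32) - level, 0.0, None)
```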
For subframed images where these areas are cropped away, a static estimate could be used instead, though some images, especially movie frames, appear to already have some kind of black subtraction applied to them.
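
A hedged sketch of that fallback, reusing `estimate_bias_dark` from the snippet above; the static level here is a made-up placeholder and would have to be determined empirically per camera and readout mode:

```python
# Hypothetical fallback value, NOT from the datasheet or the public raw images.
STATIC_BIAS_DARK_ESTIMATE = 120.0

def bias_dark_level(raw: np.ndarray, has_masked_regions: bool) -> float:
    """Use the masked-pixel estimate when the dark columns are present in the
    frame; otherwise (e.g. subframed images with those columns cropped away)
    fall back to a static estimate."""
    if has_masked_regions:
        return estimate_bias_dark(raw)
    return STATIC_BIAS_DARK_ESTIMATE
```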
Adding this will improve the calibration of the KAI-2020-based cameras.