Closed: jenshannoschwalm closed this issue 11 months ago
Interesting issue, could you give me a pointer or two? I see that hdrmerge simply copies the EXIF data as-is, but what is the correct expected behaviour here? Should we perhaps simply not copy the BadPixel item?
I am really not an expert on this topic. But from what I understand of the DNG specs, I would think that probably none of the "jobs" in opcode lists 1-3 should still be applied after merging. Lens correction, vignetting, warping, gain correction ... are IMHO no longer valid.
I took a look at the hdrmerge code. From what I see, you don't correct any bad pixels in the originals before you do the merging. This is definitely a problem for images from cameras that rely on postprocessing and don't do it themselves. So the merged image has less quality than could be achieved. Anyway, the BadPixels information is not valid at all after merging, so it must be excluded. Warp and vignetting can probably stay, as those don't operate on the individual-pixel level.
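To make the "exclude BadPixels" option concrete, here is a minimal sketch (assumptions: pure Python, the big-endian OpcodeList layout from the DNG 1.3 spec, and opcode IDs 4 = FixBadPixelsConstant and 5 = FixBadPixelsList; the function name is hypothetical, not HDRMerge code) that filters the bad-pixel opcodes out of an OpcodeList1 blob while keeping warp, vignetting, and gain-map entries:

```python
import struct

# DNG 1.3 opcode IDs that become invalid after merging (assumption:
# only the bad-pixel opcodes; geometric opcodes are left in place).
FIX_BAD_PIXELS_CONSTANT = 4
FIX_BAD_PIXELS_LIST = 5

def strip_bad_pixel_opcodes(blob: bytes) -> bytes:
    """Return a copy of an OpcodeList blob without bad-pixel opcodes.

    The list layout (big-endian) is: a uint32 opcode count, then per
    opcode a header of four uint32s (opcode ID, DNG version, flags,
    parameter size) followed by that many parameter bytes.
    """
    (count,) = struct.unpack_from(">I", blob, 0)
    offset = 4
    kept = []
    for _ in range(count):
        opcode_id, version, flags, size = struct.unpack_from(">IIII", blob, offset)
        entry = blob[offset:offset + 16 + size]  # header + parameters
        offset += 16 + size
        if opcode_id not in (FIX_BAD_PIXELS_CONSTANT, FIX_BAD_PIXELS_LIST):
            kept.append(entry)
    return struct.pack(">I", len(kept)) + b"".join(kept)
```

The stripped blob could then be written back into the merged DNG (or the tag dropped entirely, as discussed below).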
Yup, the merged image doesn't correct anything. The metadata is simply copied from one of the source files to the output merged DNG, if I understand this correctly.
If I got it right, then I'll start by excluding the BadPixels info 👍
@fanckush Why not implement the best option? I would contribute in that case. We could use the bad-pixel interpolation code from RawTherapee. Then the remaining part would be to get the coordinates of the bad pixels from OpcodeList1. What do you think?
@heckflosse if you know how to do that, it would be great! I simply lack the knowledge about BadPixels, so I went with the simpler option.
@fanckush Can you provide an example dng (an original, not a hdrmerged one) with badpixel info?
https://github.com/darktable-org/darktable/issues/3510 Could be a starting point including 5 images ...
from darktable-org/darktable#3510 you can download the original DNGs: https://drive.google.com/drive/folders/1Oec8SjmgBRmHUPnAwKqfoAioHQ_rr_vq
Ah, that one. I already wondered why dt shows these artifacts while rt (which does not correct bad pixels for this file) works fine.
To me, it looks more like a bug in dt (maybe the interpolation code in dt does not work correctly on floating-point DNG files).
Anyway, it would be good if hdrmerge could correct bad pixels before merging, because bad pixels may also confuse the alignment process, for example.
Currently there are 4 cases:

1. Cameras which don't give any info about the bad pixels
2. Cameras (mainly Panasonic and Leica) which mark bad pixels with a zero
3. DNG files with a bad-pixel list
4. DNG files with a bad-pixel constant (bad pixels are marked with a constant value)

For 1) we could use the bad-pixel detection code from RT: https://github.com/Beep6581/RawTherapee/blob/dev/rtengine/badpixels.cc#L477
For 2) and 4) we just have to check the raw data for the constant value.
For 3) we have to read the list from the DNG tag.

For all of them we can use the interpolation code from RT, for bayer (https://github.com/Beep6581/RawTherapee/blob/dev/rtengine/badpixels.cc#L66) or xtrans (https://github.com/Beep6581/RawTherapee/blob/dev/rtengine/badpixels.cc#L288).
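For intuition, here is a toy sketch of the interpolation step (assumptions: NumPy, a single-channel Bayer mosaic, and neighbours at offsets of ±2 so they share the same CFA colour; this is a simplification of the RT code linked above, not a copy of it):

```python
import numpy as np

def interpolate_bad_pixels_bayer(raw: np.ndarray, bad: list[tuple[int, int]]) -> np.ndarray:
    """Replace each bad pixel with the mean of its valid same-colour
    Bayer neighbours. Offsets of +/-2 along rows and columns land on
    the same CFA colour, so the average stays within one channel.
    """
    out = raw.astype(np.float64).copy()
    bad_set = set(bad)
    h, w = raw.shape
    for y, x in bad:
        vals = []
        for dy, dx in ((-2, 0), (2, 0), (0, -2), (0, 2)):
            ny, nx = y + dy, x + dx
            # Skip neighbours outside the frame or themselves bad.
            if 0 <= ny < h and 0 <= nx < w and (ny, nx) not in bad_set:
                vals.append(out[ny, nx])
        if vals:
            out[y, x] = sum(vals) / len(vals)
    return out
```

The real RT code is considerably more careful (median-like weighting, X-Trans handling); this only shows the shape of the operation.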
> I already wondered why dt shows these artifacts, while rt (which does not correct bad pixels for this file) works fine. To me, it looks more like a bug in dt (maybe the interpolation code in dt does not work correctly on floating-point DNG files)
@heckflosse In dt the pixel artefacts go away if I disable dt's bad-pixel correction. Maybe the interpolation code is broken, although so far I am not aware of any original file that shows such problems. Also, I can't see any problem in the original files of the above issue. Do you have an example file for me to look deeper into this?
Also, I don't know the rt code at all; could you explain in short why rt does not "correct bad pixels for this file"? Is there a specific reason for not doing so?
I haven't looked at either of the code bases, but I think (and I'm only guessing here) that RT detects bad pixel data and then interpolates without taking OpcodeList1 into consideration, while DT follows the OpcodeList1 and then interpolates according to it. This means that HDR DNGs from HDRMerge with a faulty OpcodeList1 are irrelevant in RT but very important in DT. Please correct me if I'm wrong, but if that is the case, then I think HDRMerge is responsible for either correcting the bad pixels itself before merging, or removing the BadPixels metadata info for raw editors/viewers that may rely on it.

@fanckush
> RT detects bad pixel data then interpolates them without taking OpcodeList1 into consideration
Partly correct. For the example file I tested without bad-pixel detection/interpolation and could not see any relevant artifacts caused by bad pixels in RT. Currently RT supports only the FixBadPixelsConstant opcode from OpcodeList1, and only for the case that its value is zero.
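As a rough illustration of that FixBadPixelsConstant(0) case (assumptions: NumPy and a hypothetical helper name; this is not RT's actual code), finding the zero-marked pixels is just a scan of the raw mosaic:

```python
import numpy as np

def find_zero_marked_bad_pixels(raw: np.ndarray) -> list[tuple[int, int]]:
    """Collect (row, col) coordinates of pixels whose value equals the
    bad-pixel constant zero, i.e. the FixBadPixelsConstant(0) case."""
    ys, xs = np.nonzero(raw == 0)
    return list(zip(ys.tolist(), xs.tolist()))
```

The resulting coordinate list could then feed the same interpolation routine used for the DNG bad-pixel list case.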
> while DT follows the OpcodeList1 and then interpolates according to it.
Don't know. If it interpolates, it should not generate such strange artifacts, even if the OpcodeList1 contains data (coordinates) which do not perfectly match the generated float DNG file.
Imho the only correct way to handle bad pixels in hdrmerge is before the merge. For that we can use code from RT, and probably from DT (if DT has code for the FixBadPixelsList opcode).
Hi, any news on this? I've now noticed the same artefacts showing with PhotoFlow app:
Not from my side ...
Photoflow also uses rawspeed as the decoder, so the wrong pixels also show up there. I looked through rawspeed's correction code and so far could not find a related bug. I am pretty sure the data in the opcode list is simply wrong. (RT does not even try to correct those, so it won't show.)
I guess it's really up to hdrmerge...
Why don't you compile your own hdrmerge and remove the EXIF data as suggested? That works just fine. Hanno
To me it looks like a bug in Photoflow/dt. Why should a simple pixel interpolation produce such colourful artifacts in the surroundings, while the uncorrected result of RT does not show the artifacts?
I'm not a C++ dev, so I won't compile my own version for now, but I'll leave the steps of the workaround here for anyone else with this issue until it gets a fix. And yeah, just to confirm: removing OpcodeList1 seems to be a solution for me when editing hdrmerge HDR images with Darktable.
Install exiftool:

```
sudo apt install libimage-exiftool-perl
```

Remove OpcodeList1 from the image metadata (replace `<FILE>` with the name of your image):

```
exiftool -OpcodeList1= <FILE>
```

To save time and do this to an entire directory of images:

```
exiftool -OpcodeList1= *
```
When trying to edit an HDRMerged image in Lightroom, it throws an error ("The file appears to be unsupported or damaged") and all the tools are disabled (grayed out and can't be manipulated).
Here is the screenshot of the errored image:
When removing OpcodeList1 from the metadata of the same HDRMerged image, everything works fine and the image loads, with a bit less exposure:
The OpcodeList1 contains more information than what is interpreted & corrected by dt or rt. As said, that should be corrected in any raw reader and should not be passed on to a written image.
For this case I am not sure about the origin; is this buggy list already in the source images?
@jenshannoschwalm these are the source images:
https://drive.google.com/open?id=1Oec8SjmgBRmHUPnAwKqfoAioHQ_rr_vq They all open fine in any app.
Please have a look at https://github.com/darktable-org/darktable/issues/3510
In short: an hdrmerged DNG file shows single-dot artefacts when using darktable; the issue is not seen in rt. It could be tracked down to BadPixel handling. The OpcodeList1 seems to be 'just copied' from the original DNG file, and that seems wrong to me.
I have never used hdrmerge, but while debugging dt I ran into this problem. @KristijanZic opened the issue there.