jcelaya / hdrmerge

HDR exposure merging
http://jcelaya.github.io/hdrmerge/

More noisy data appear in place of perfectly well-exposed pixels #173

Open 10110111 opened 5 years ago

10110111 commented 5 years ago

I was trying HDRMerge with three files, of which the one with the longest exposure is actually sufficient: it has no overexposed pixels and, due to having the largest signal, has the highest SNR. But for some reason HDRMerge decides to place underexposed data in the brightest parts of the image, resulting in very visible noise which stops abruptly outside the bright part of the image.

Here's a link to the three example bracketed photos: Google Drive link. Steps to reproduce the problem:

  1. Run HDRMerge
  2. Uncheck all the checkboxes (alignment, cropping and custom white level)
  3. Press Add, choose the three photos (I used Ctrl+A to select them, they got listed in alphabetical order in File name field)
  4. Open, Accept, File → Save HDR..., Save

Resulting image: Google Drive link.

fanckush commented 5 years ago

Logically, the job of HDR is to take the bright parts from the lowest-exposed image, and that is what is happening here: [screenshot: Screenshot 2019-05-11 at 12 57 20]

I'm curious: what is the expected behaviour in a situation like this, where the brightest image is not actually overexposed? Should HDRMerge ignore the other images, since they don't provide much?

By the way, HDRMerge gave the images the following EVs: +0 EV (the darkest image), +2.30 EV (the middle one), +4.60 EV (the somewhat normally exposed one). Could that be the reason? Maybe it's considering the brightest image overexposed when it's not. A first step towards a solution would be to see how the EV is calculated.
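For reference, the relative EV between two brackets is just the base-2 logarithm of their exposure ratio. A minimal sketch of estimating it from pixel data (illustrative only, not HDRMerge's actual code; all names are made up):

```cpp
#include <cmath>
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical sketch: the relative EV between two brackets can be
// estimated as log2 of the brightness ratio of pixels that are well
// exposed in both frames.
double relativeEV(const std::vector<uint16_t>& bright,
                  const std::vector<uint16_t>& dark,
                  uint16_t saturation) {
    double sumBright = 0.0, sumDark = 0.0;
    for (std::size_t i = 0; i < bright.size(); ++i) {
        // Skip clipped or near-black samples; they distort the ratio.
        if (bright[i] < saturation && dark[i] > 0) {
            sumBright += bright[i];
            sumDark += dark[i];
        }
    }
    return std::log2(sumBright / sumDark);
}
```

If such an estimate feeds the saturation logic, a biased ratio would propagate into which pixels get treated as clipped.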

10110111 commented 5 years ago

the job of HDR is to take the bright parts from the lowest exposed image

Actually no, the job of HDR is to choose the "best" data from those available. "Best" means valid (not clipped) and having the highest SNR among the valid samples available.

What is the expected behaviour from a situation like this where the brightest image is not actually over exposed. Should HDRMerge ignore the other images?

Yes, the lower-exposed images should be ignored, since the best data are all in the brightest image in this case.
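The selection rule described above can be sketched as follows (illustrative, not HDRMerge's actual code; samples are assumed sorted from longest to shortest exposure, so the first unclipped sample carries the largest signal and thus the best SNR):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Per-pixel layer choice: return the index of the longest exposure
// whose raw value is still below the saturation level.
int pickLayer(const std::vector<uint16_t>& samples, uint16_t saturation) {
    for (std::size_t i = 0; i < samples.size(); ++i) {
        if (samples[i] < saturation) {
            return static_cast<int>(i);  // longest unclipped exposure wins
        }
    }
    return static_cast<int>(samples.size()) - 1;  // everything clipped
}
```

Under this rule, a bracket whose longest exposure is nowhere clipped should win every pixel, which is exactly the behaviour 10110111 expected.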

maybe it's considering the brightest image over exposed when it's not. a solution would be to see how the EV is calculated?

Not sure what EV has to do with checks for overexposure. The correct way to determine whether a (sub)pixel is overexposed is to compare its raw value to the maximum value of the camera that took the photo.
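That check can be written as a one-liner (a sketch; the white level comes from camera metadata, `params.max` in HDRMerge's terms, and the 1% margin is an assumption to guard against sensors that clip slightly below the nominal maximum):

```cpp
#include <cstdint>

// A subpixel is overexposed iff its raw value reaches (a small margin
// below) the camera's white level.
inline bool isOverexposed(uint16_t raw, uint16_t whiteLevel) {
    return raw >= static_cast<uint16_t>(whiteLevel * 0.99);
}
```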

fanckush commented 5 years ago

I haven't checked the code, but the calculated EV values seem to be off, meaning the code is possibly not measuring/detecting overexposure properly when deciding which pixels from which layer to use.

fanckush commented 5 years ago

Can you try this patch?

index 6074321..81d6c06 100644
--- a/src/ImageStack.cpp
+++ b/src/ImageStack.cpp
@@ -90,6 +90,7 @@ void ImageStack::calculateSaturationLevel(const RawParameters & params, bool use
     }

     const size_t threshold = width * height / 10000;
+    const uint16_t min_saturation_threshold = params.max * 0.80;
     uint16_t maxPerColor[4] = {0, 0, 0, 0};

@@ -112,6 +113,8 @@ void ImageStack::calculateSaturationLevel(const RawParameters & params, bool use
     }

     if (!useCustomWl) { // only scale when no custom white level was specified
+        // assume default saturation in case of dark images
+        satThreshold = std::max(satThreshold, min_saturation_threshold);
         satThreshold *= 0.99;
     }

Seems to perform better with your set of pictures.
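Numerically, the clamp works like this (a worked sketch assuming a 14-bit sensor with `params.max == 16383`; the helper name is illustrative):

```cpp
#include <algorithm>
#include <cstdint>

// A dark bracket may never reach saturation, so the threshold measured
// from its histogram can come out far too low; the patch floors it at
// 80% of the camera's white level before the existing 0.99 scaling.
uint16_t effectiveThreshold(uint16_t measured, uint16_t cameraMax) {
    const uint16_t minThreshold = static_cast<uint16_t>(cameraMax * 0.80);
    return static_cast<uint16_t>(std::max(measured, minThreshold) * 0.99);
}
```

For example, a measured threshold of 9000 gets raised to 13106 (80% of 16383) and then scaled to 12974, so well-exposed pixels below that value are no longer misclassified as clipped.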

10110111 commented 5 years ago

Although it doesn't apply to current master, I've applied it manually, and it does seem to give a good result.