Open 10110111 opened 5 years ago
Logically, the job of HDR is to take the bright parts from the lowest-exposed image. And that is what is happening here:
I'm curious: what is the expected behaviour in a situation like this, where the brightest image is not actually overexposed? Should HDRMerge ignore the other images, since they don't contribute much?
By the way, HDRMerge assigned the images the following EVs: +0 EV (the darkest image), +2.30 EV (middle), +4.60 EV (roughly normal exposure). Could that be the reason? Maybe it considers the brightest image overexposed when it isn't. One way forward would be to check how the EV is calculated.
the job of HDR is to take the bright parts from the lowest exposed image
Actually no, the job of HDR is to choose the "best" data from the available samples. "Best" means valid (not clipped) and having the highest SNR among the valid samples available.
What is the expected behaviour from a situation like this where the brightest image is not actually over exposed. Should HDRMerge ignore the other images?
Yes, the lower-exposed images should be ignored, since the best data are all in the brightest image in this case.
maybe it's considering the brightest image over exposed when it's not. a solution would be to see how the EV is calculated?
Not sure what EV has to do with checks for overexposure. The correct way to determine whether a (sub)pixel is overexposed is to compare its raw value to the saturation (maximum) value of the camera that took the photo.
I haven't checked the code, but the calculated EV values seem to be off, which suggests the code may not be measuring/detecting overexposure properly when deciding which pixels to take from which layer.
Can you try this patch?
```diff
index 6074321..81d6c06 100644
--- a/src/ImageStack.cpp
+++ b/src/ImageStack.cpp
@@ -90,6 +90,7 @@ void ImageStack::calculateSaturationLevel(const RawParameters & params, bool use
     }
     const size_t threshold = width * height / 10000;
+    const uint16_t min_saturation_threshold = params.max * 0.80;
     uint16_t maxPerColor[4] = {0, 0, 0, 0};
@@ -112,6 +113,8 @@ void ImageStack::calculateSaturationLevel(const RawParameters & params, bool use
         }
         if (!useCustomWl) { // only scale when no custom white level was specified
+            // assume default saturation in case of dark images
+            satThreshold = std::max(satThreshold, min_saturation_threshold);
             satThreshold *= 0.99;
         }
```
Seems to perform better with your set of pictures.
Although it doesn't apply cleanly to current master, I've applied it manually, and it does seem to give a good result.
I was trying HDRMerge with three files, of which the one with the longest exposure is actually sufficient: it has no overexposures and, due to having the largest signal, has the highest SNR. But for some reason HDRMerge decides to place some underexposed data in the brightest parts of the image, resulting in very visible noise that stops abruptly outside the bright region.
Here's a link to the example three bracketed photos: Google Drive link. Steps to reproduce the problem:
Resulting image: Google Drive link.