Closed fanckush closed 5 years ago
> How does anyone (Photoshop, RawDigger, our cameras...) know if an exposure value V at a specific pixel (X, Y) is considered saturated (a burned highlight) or not?
In RawTherapee, which does not use LibRaw, we measure raw files based on a set of clipped white-frames and register the white levels in a JSON file which RawTherapee checks every time it loads a raw file, see:
HDRMerge uses LibRaw, and here things are different. From my very limited understanding, LibRaw reports the maximum possible value supported by the raw format, regardless of whether the actual values reach that level or are even capable of reaching it, I suppose leaving it up to the calling program (HDRMerge) to figure out what the real white levels are.
> Don't cameras have a brightness value cap that they can't record beyond?
Yes, a point at which a photosite saturates, but with other possible complications; see the long intro in the camconst.json file.
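For anyone curious, a common trick for measuring the real white level of a clipped frame is to look for the pile-up of identical values at the top of the raw histogram, since saturated photosites all land on the same number. A minimal sketch of that idea (the function name and the 0.1% cut-off are mine, not taken from any of the projects mentioned here):

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Illustrative sketch: estimate the effective white level of a raw frame.
// Clipped photosites pile up at the saturation point, so a spike near the
// top of the histogram is a good candidate for the real white level.
uint16_t estimateWhiteLevel(const std::vector<uint16_t>& rawValues) {
    if (rawValues.empty()) return 0;
    uint16_t maxVal = *std::max_element(rawValues.begin(), rawValues.end());
    std::vector<size_t> histogram(size_t(maxVal) + 1, 0);
    for (uint16_t v : rawValues) ++histogram[v];
    // Walk down from the top; the first value with a noticeable pile-up
    // (here: at least 0.1% of all samples, an arbitrary cut-off) is taken
    // as the saturation level.
    const size_t threshold = std::max<size_t>(1, rawValues.size() / 1000);
    for (size_t v = maxVal; v > 0; --v) {
        if (histogram[v] >= threshold) return uint16_t(v);
    }
    return maxVal;
}
```

The real measurements in camconst.json are done on dedicated clipped white-frames, which avoids mistaking a bright-but-unclipped scene value for the saturation point.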
I gotta say, that's exactly what just happened to me. Mind blown. I can't believe the struggle and the different approaches for something I took for granted all this time. Even just displaying a raw image on a screen is a challenge!
Thanks a lot
@fanckush Maybe related: https://github.com/jcelaya/hdrmerge/issues/126
Anyone know the reason behind this?
https://github.com/jcelaya/hdrmerge/blob/3ad6d36c517ddb7f164799c98800bdccb2e5b9f2/src/ImageStack.cpp#L92
In previous commits, the division was by 1000. Is there any math behind this, or is it just arbitrary?
@heckflosse Thanks, that looks useful. It's probably safe to assume that all the bugs with highlights and noisy data are caused by the way saturation is calculated.
And for reference, rawspeed/darktable do the same as RawTherapee, i.e. measure the white levels 'manually' (or look them up in metadata if available, e.g. for DNG): https://github.com/darktable-org/rawspeed/blob/develop/data/cameras.xml
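The lookup side of that approach is simple once the table exists. A toy sketch of the idea (the camera names and values below are invented; the real camconst.json and cameras.xml entries are far richer, e.g. with ISO-dependent levels):

```cpp
#include <cstdint>
#include <map>
#include <string>

// Illustrative sketch only: RawTherapee (camconst.json) and rawspeed
// (cameras.xml) both ship hand-measured white levels keyed by camera
// model. This toy table just demonstrates the lookup; values are made up.
uint16_t whiteLevelFor(const std::string& cameraModel) {
    static const std::map<std::string, uint16_t> table = {
        {"ExampleCam A", 15760},  // hypothetical measured level
        {"ExampleCam B", 16100},  // hypothetical measured level
    };
    auto it = table.find(cameraModel);
    // Fall back to the raw format's maximum when the camera is unknown.
    return it != table.end() ? it->second : 16383;
}
```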
I finally understand some of this better now. Seems like I'm late to the game and you guys have already been over this whole thing.
LibRaw's `params.color.max` seems to be 16383. As far as I understand, this value is hardcoded into LibRaw and does not depend on the loaded raw image.
I only have a patchy idea in mind: take this max value as an absolute (even if the camera will NEVER reach that value) and make sure the computed `satThreshold` is not smaller than 80% of that max.
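As a rough sketch of that idea (names are illustrative, not HDRMerge's actual code):

```cpp
#include <algorithm>
#include <cstdint>

// Sketch of the idea above, not HDRMerge's real implementation: compute a
// saturation threshold from the image data, but never let it fall below
// 80% of the maximum LibRaw reports (params.color.max), so noisy frames
// cannot drag the threshold absurdly low.
uint16_t clampSatThreshold(uint16_t computed, uint16_t librawMax) {
    uint16_t floor80 = uint16_t((uint32_t(librawMax) * 8) / 10);
    return std::max(computed, floor80);
}
```

For example, with `librawMax` = 16383 the threshold would never drop below 13106, no matter what the per-image estimate says.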
@fanckush see also this issue for insight and surprises: https://github.com/LibRaw/LibRaw/issues/82
Hey guys! It's been a while since my last commit; I've been busy with.. life. But I really want to get back to this and resolve everything needed to release V1.0. I guess I'm just saying I didn't give up on this 😅
All the questions I had are clear now. Closing for now. If anyone in the future has questions on this topic, I'll be glad to explain.
This may not be the best place to post this question but it's at least relevant to how saturation levels are computed in HDRMerge.
I wanted to contribute to this project, so I've been looking into the code and trying to understand it better. One part I found particularly strange was: https://github.com/jcelaya/hdrmerge/blob/3ad6d36c517ddb7f164799c98800bdccb2e5b9f2/src/ImageStack.cpp#L54 If I understand this correctly, then it's a problem: the brightest image in a set will always be considered overexposed, even when it's not.
Of course, I was clueless about EVERYTHING; for example, I didn't know that raw images only store brightness values. HDRMerge took the easy approach of assuming there is always over-saturation (overexposed pixels).
Now my questions are:
How does anyone (Photoshop, RawDigger, our cameras...) know if an exposure value V at a specific pixel (X, Y) is considered saturated (a burned highlight) or not? Is it always just an approximation? In that case, does each piece of software compute the threshold differently?
Don't cameras have a brightness value cap that they can't record beyond? If so, then we could set the saturation threshold properly, knowing that, for example, 6845839 is the max brightness of camera C.
I am probably mixing up the words "exposure", "brightness", "white level", and "saturation". What I am ultimately referring to is the value registered by the sensor at any pixel, in any space.